Hi,
I'm currently using nvcr.io/nvidia/tritonserver:25.09-trtllm-python-py3.
Is there a way to use both TensorRT and TensorRT-LLM with this same image? I have a Gemma 3 model and a ModernBERT model that I want to serve on a single GPU from the same Docker image.
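For context, the setup I'm after would look roughly like the sketch below: one Triton model repository where each model picks its backend in its own `config.pbtxt` (this assumes the trtllm image actually ships the `tensorrt` backend alongside the `tensorrtllm` backend; the model names and paths are just placeholders):

```
model_repository/
├── gemma3/                 # served via the TensorRT-LLM backend
│   ├── config.pbtxt        # contains: backend: "tensorrtllm"
│   └── 1/
│       └── (compiled TRT-LLM engine files)
└── modernbert/             # served via the plain TensorRT backend
    ├── config.pbtxt        # contains: platform: "tensorrt_plan"
    └── 1/
        └── model.plan      # serialized TensorRT engine
```

Both models would then be loaded by a single `tritonserver --model-repository=/models` process on the one GPU.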
cc @lix19937