•   over 1 year ago

What are the best ways to use NVIDIA AI Workbench without GPU access or sufficient local resources?

Many developers face limitations in using NVIDIA AI Workbench due to lack of GPU access, local storage, or computational power. What strategies or solutions can the community recommend to optimize the use of the platform under these constraints? Are there cloud options or best practices that could help?

  • 4 comments

  • Manager   •   over 1 year ago

    Hi Muhammad - This is an excellent question, and thanks for bringing it up.

    If you don't have sufficient local resources, you can run AI Workbench on a remote machine and then connect to it from your local Workbench.
    Our docs show how to do this.
    Remote install is here: https://docs.nvidia.com/ai-workbench/user-guide/latest/installation/ubuntu-remote.html
    Connecting is here: https://docs.nvidia.com/ai-workbench/user-guide/latest/locations/remote.html

    However, this requires access to a remote machine, typically in the cloud, that has a sufficient GPU.

    Do you have access to GPUs in the cloud?

    Sincerely,
    Tyler

  •   •   over 1 year ago

    Hey Tyler. I have the same issue. I do not have access to GPU and I do not have a cloud machine that has GPUs. So, I have been using APIs to run inferences. Will this be acceptable?

  •   •   over 1 year ago

    Hey Tyler,

    Unfortunately, no, I don't have access to a cloud GPU.

  • Manager   •   over 1 year ago

    Yes, APIs are acceptable. However, the application should also be adaptable to use GPUs on a host, like a workstation, when they are available.
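    One way to structure that adaptability is a small backend-selection helper: use a local GPU when one is detected, otherwise fall back to a hosted inference API. The sketch below is an assumption-laden illustration, not an AI Workbench API — the `nvidia-smi` heuristic and the placeholder endpoint are examples you would replace with your own provider's details.

    ```python
    import shutil

    # Hypothetical endpoint -- substitute your inference provider's URL.
    API_URL = "https://example.com/v1/completions"

    def local_gpu_available() -> bool:
        """Heuristic check: hosts with an NVIDIA GPU driver normally
        have nvidia-smi on the PATH."""
        return shutil.which("nvidia-smi") is not None

    def choose_backend() -> str:
        """Return 'local' when a GPU is detected, else 'api' so the
        app falls back to a hosted inference service."""
        return "local" if local_gpu_available() else "api"
    ```

    With this pattern, the same project runs unchanged on a GPU workstation or a GPU-less laptop; only the backend selection differs.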

Comments are closed.