Kepler is the GTX 600/700 series, so on any reasonably modern GPU this should just work.
| -pl, --power-limit=POWER_LIMIT
Specifies maximum power limit in watts. Accepts integer and floating point numbers. It takes an optional argument --scope. Only on supported devices from the Kepler family. Value needs to be between the Min and Max Power Limit as reported by nvidia-smi. Requires root.

-sc, --scope=0/GPU, 1/TOTAL_MODULE
Specifies the scope of the power limit. The options are:
0/GPU: changes the power limit for the GPU only.
1/TOTAL_MODULE: changes the power limit for the module containing multiple components, e.g. GPU and CPU. |
[링크 : https://docs.nvidia.com/deploy/nvidia-smi/]
[링크 : https://developer0hye.tistory.com/690]
2026.04.24
How do you find the valid range? Passing an out-of-range value makes nvidia-smi report the allowed Min/Max:
| C:\>nvidia-smi -pl 100
Provided power limit 100.00 W is not a valid power limit which should be between 125.00 W and 300.00 W for GPU 00000000:01:00.0
Terminating early due to previous errors.

C:\>nvidia-smi -pl 125
Failed to set power management limit for GPU 00000000:01:00.0: Insufficient Permissions
Terminating early due to previous errors. |
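Rather than provoking the error, the range can also be read directly: `nvidia-smi -q -d POWER` prints the Min/Max Power Limit fields. A minimal sketch that parses that output (the sample text below is assumed for illustration, not captured from a real device; on a real machine you would feed it the output of `subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], ...)`):

```python
import re

def power_limit_range(text: str) -> tuple[float, float]:
    """Extract (min_w, max_w) from `nvidia-smi -q -d POWER` output."""
    min_w = re.search(r"Min Power Limit\s*:\s*([\d.]+)\s*W", text)
    max_w = re.search(r"Max Power Limit\s*:\s*([\d.]+)\s*W", text)
    if not (min_w and max_w):
        raise ValueError("power limit range not found in nvidia-smi output")
    return float(min_w.group(1)), float(max_w.group(1))

# Assumed sample fragment, matching the 125-300 W range seen above:
sample = """
    Power Limit                       : 250.00 W
    Min Power Limit                   : 125.00 W
    Max Power Limit                   : 300.00 W
"""
print(power_limit_range(sample))  # (125.0, 300.0)
```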
In any case, the limit can only be changed from a cmd window running as administrator:
| C:\>nvidia-smi -pl 125
Power limit for GPU 00000000:01:00.0 was set to 125.00 W from 250.00 W.
All done.

C:\>nvidia-smi -pl 250
Power limit for GPU 00000000:01:00.0 was set to 250.00 W from 125.00 W.
All done.

C:\>nvidia-smi -pl 125
Power limit for GPU 00000000:01:00.0 was set to 125.00 W from 250.00 W.
All done. |
