Python code that reads my PC's specs and recommends a DeepSeek R1 model suitable for running locally

2025. 2. 1. 23:06 · Uncategorized


# Requires the third-party packages psutil and GPUtil
# (e.g. pip install psutil gputil).
import platform
import psutil
import GPUtil  # reports NVIDIA GPUs via nvidia-smi; other GPUs are not detected

def get_system_specs():
    """
    Collects the system's basic spec information and returns it as a dictionary.
    """
    specs = {}

    # OS, machine, and processor info
    specs["os"] = platform.system() + " " + platform.release()
    specs["machine"] = platform.machine()
    specs["processor"] = platform.processor()

    # CPU core counts
    specs["cpu_count_logical"] = psutil.cpu_count(logical=True)
    specs["cpu_count_physical"] = psutil.cpu_count(logical=False)

    # Memory info (in GB)
    vm = psutil.virtual_memory()
    specs["total_ram_gb"] = round(vm.total / (1024**3), 2)

    # GPU info (empty list when no GPU is detected)
    gpus = GPUtil.getGPUs()
    gpu_list = []
    for gpu in gpus:
        gpu_info = {
            "id": gpu.id,
            "name": gpu.name,
            "load": gpu.load,
            "memory_total_gb": round(gpu.memoryTotal / 1024, 2),  # MB -> GB
            "memory_used_gb": round(gpu.memoryUsed / 1024, 2),
            "memory_free_gb": round(gpu.memoryFree / 1024, 2)
        }
        gpu_list.append(gpu_info)
    specs["gpus"] = gpu_list

    return specs

def recommend_deep_seek_model(specs):
    """
    Decides which DeepSeek R1 model to recommend from the collected
    system specs.

    Example thresholds (adjust to your own environment):
      - No GPU: recommend based on total RAM
          * RAM < 8 GB          → "1.5b"
          * 8 GB ≤ RAM < 16 GB  → "7b"
          * 16 GB ≤ RAM < 32 GB → "8b"
          * 32 GB ≤ RAM < 64 GB → "14b"
          * RAM ≥ 64 GB         → "32b"

      - GPU present: prioritize the largest GPU's total memory (max_gpu_mem)
          * max_gpu_mem < 4 GB           → "1.5b"
          * 4 GB ≤ max_gpu_mem < 6 GB    → "7b"
          * 6 GB ≤ max_gpu_mem < 8 GB    → "8b"
          * 8 GB ≤ max_gpu_mem < 10 GB   → "14b"
          * 10 GB ≤ max_gpu_mem < 16 GB  → "32b"
          * 16 GB ≤ max_gpu_mem < 24 GB, or RAM < 64 GB → "70b"
          * max_gpu_mem ≥ 24 GB and RAM ≥ 64 GB         → "671b"
    """
    total_ram = specs["total_ram_gb"]

    # If a GPU is present, take the largest GPU's total memory (0 if none)
    if specs["gpus"]:
        max_gpu_mem = max(gpu["memory_total_gb"] for gpu in specs["gpus"])
    else:
        max_gpu_mem = 0

    # No GPU: recommend based on total RAM
    if not specs["gpus"]:
        if total_ram < 8:
            return "1.5b"
        elif total_ram < 16:
            return "7b"
        elif total_ram < 32:
            return "8b"
        elif total_ram < 64:
            return "14b"
        else:
            return "32b"
    else:
        # GPU present: recommend by GPU memory (also checking total RAM)
        if max_gpu_mem < 4:
            return "1.5b"
        elif max_gpu_mem < 6:
            return "7b"
        elif max_gpu_mem < 8:
            return "8b"
        elif max_gpu_mem < 10:
            return "14b"
        elif max_gpu_mem < 16:
            return "32b"
        elif max_gpu_mem < 24 or total_ram < 64:  # "671b" needs both ≥24 GB VRAM and ≥64 GB RAM
            return "70b"
        else:
            return "671b"

def main():
    specs = get_system_specs()
    
    # Print the system specs
    print("System specs:")
    print(f"  OS: {specs['os']}")
    print(f"  Machine: {specs['machine']}")
    print(f"  Processor: {specs['processor']}")
    print(f"  Physical CPU cores: {specs['cpu_count_physical']}, logical CPU cores: {specs['cpu_count_logical']}")
    print(f"  Total RAM: {specs['total_ram_gb']} GB")
    
    if specs["gpus"]:
        print("  GPU info:")
        for gpu in specs["gpus"]:
            print(f"    - {gpu['name']} (total memory: {gpu['memory_total_gb']} GB, free memory: {gpu['memory_free_gb']} GB)")
    else:
        print("  GPU info: no GPU detected")
    
    # Decide which model to recommend
    recommended_model = recommend_deep_seek_model(specs)
    print("\nRecommended DeepSeek R1 model:", recommended_model)

if __name__ == "__main__":
    main()
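As a quick sanity check, the no-GPU branch of the thresholds above can be exercised on its own. Here is a minimal standalone sketch that duplicates just the RAM table, so it runs without psutil or GPUtil; the function name `recommend_by_ram` is made up for this example and is not part of the script above.

```python
# Standalone copy of the no-GPU (RAM-only) thresholds from
# recommend_deep_seek_model, for checking the cut-offs in isolation.
# `recommend_by_ram` is a hypothetical helper name.
def recommend_by_ram(total_ram_gb):
    if total_ram_gb < 8:
        return "1.5b"
    elif total_ram_gb < 16:
        return "7b"
    elif total_ram_gb < 32:
        return "8b"
    elif total_ram_gb < 64:
        return "14b"
    return "32b"

for ram in (4, 12, 24, 48, 128):
    print(f"{ram} GB RAM -> deepseek-r1:{recommend_by_ram(ram)}")
```

Running this prints one recommendation per sample RAM size, which makes it easy to verify that each boundary (8, 16, 32, 64 GB) falls into the intended bucket.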

반응형
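Once the script prints a size tag, it maps directly onto a run command. A small sketch, assuming the Ollama model library's `deepseek-r1` tags (which use the same 1.5b/7b/8b/14b/32b/70b/671b naming as the strings returned above); `ollama_run_command` is a hypothetical helper, not part of Ollama's CLI or API.

```python
# Turn a recommended size tag into an Ollama shell command.
# Assumes Ollama's deepseek-r1:<size> tag naming convention;
# `ollama_run_command` is a made-up helper name.
def ollama_run_command(size_tag):
    return f"ollama run deepseek-r1:{size_tag}"

print(ollama_run_command("8b"))  # → ollama run deepseek-r1:8b
```

You could pass `recommended_model` from `main()` straight into this helper and copy the printed command into a terminal.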