Package: edgemodelr
Type: Package
Title: Local Language Model Inference
Version: 0.1.0
Authors@R: person("Pawan Rama", "Mali", email = "prm@outlook.in", role = c("aut", "cre"))
Author: Pawan Rama Mali [aut, cre]
Maintainer: Pawan Rama Mali <prm@outlook.in>
Description: Enables R users to run large language models locally using 'GGUF' model files
    and the 'llama.cpp' inference engine. Provides a complete R interface for loading models,
    generating text completions, and streaming responses in real time. Supports local
    inference without requiring cloud APIs or internet connectivity, ensuring complete
    data privacy and control. References: 'Gerganov' et al. (2023) <https://github.com/ggml-org/llama.cpp>.
License: MIT + file LICENSE
URL: https://github.com/PawanRamaMali/edgemodelr
BugReports: https://github.com/PawanRamaMali/edgemodelr/issues
Encoding: UTF-8
Depends: R (>= 4.0)
LinkingTo: Rcpp
Imports: Rcpp (>= 1.0.0), utils, tools
Suggests: testthat (>= 3.0.0), knitr, rmarkdown
SystemRequirements: C++17, GNU make or equivalent for building
Note: This package includes a self-contained 'llama.cpp' implementation,
        resulting in a larger installation size (~56 MB), to provide
        complete functionality without external dependencies.
Config/testthat/edition: 3
RoxygenNote: 7.3.3
NeedsCompilation: yes
Packaged: 2025-09-17 19:22:58 UTC; aeroe
Repository: CRAN
Date/Publication: 2025-09-22 12:00:08 UTC
