Be mindful of the environmental impact
From raw material extraction, through the manufacturing of the required GPUs and other hardware, up to model deployment, the embodied emissions, dynamic energy consumption, and idle energy consumption of a large language model add up quickly: for BLOOM, a 176-billion-parameter model, the total footprint has been estimated at roughly 50 tonnes of CO2-equivalent emissions.
Luccioni, A. S., Viguier, S., & Ligozat, A. L. (2023). Estimating the carbon footprint of BLOOM, a 176B parameter language model. Journal of Machine Learning Research, 24(253), 1-15.
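The ~50 tonne figure is the sum of three terms: embodied emissions from manufacturing the hardware, dynamic emissions from the energy drawn by the training run, and emissions from idle infrastructure. A minimal sketch of that accounting is below; the function names and all input values are illustrative assumptions chosen only to land near the reported order of magnitude, not figures quoted from the paper.

```python
# Sketch of a carbon-footprint estimate for a large training run:
# total = embodied + dynamic + idle. All numbers are illustrative placeholders.

def dynamic_emissions_kg(energy_kwh: float, pue: float,
                         grid_intensity_kg_per_kwh: float) -> float:
    """Emissions from the energy consumed by the training run itself."""
    return energy_kwh * pue * grid_intensity_kg_per_kwh


def total_footprint_kg(embodied_kg: float, dynamic_kg: float,
                       idle_kg: float) -> float:
    """Sum of the three components discussed above."""
    return embodied_kg + dynamic_kg + idle_kg


if __name__ == "__main__":
    # Assumed inputs: ~433 MWh of facility-level energy (PUE folded in, so 1.0)
    # on a low-carbon grid (~0.057 kg CO2eq/kWh).
    dynamic = dynamic_emissions_kg(energy_kwh=433_000, pue=1.0,
                                   grid_intensity_kg_per_kwh=0.057)
    # Embodied and idle terms are likewise assumed for illustration only.
    total = total_footprint_kg(embodied_kg=11_000, dynamic_kg=dynamic,
                               idle_kg=14_000)
    print(f"dynamic: {dynamic / 1000:.1f} t CO2eq, total: {total / 1000:.1f} t CO2eq")
```

The point of the decomposition is that the energy metered during training is only one term; hardware manufacturing and idle overhead can contribute a comparable share of the total.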