Using artificial intelligence (AI) / large language models (LLMs) for perovskite solar cell optimization

Semester/Master Project

Project 4

This project focuses on using different models (machine-learning-based and physics-driven) that calculate objective-function values, such as cost and environmental impact, for the multi-objective optimization of perovskite solar cells.

The project will be structured in the following parts:

  • Developing a complete mathematical framework for the optimization problem
  • Developing an understanding of the existing models for calculating the different KPIs
  • Integrating the different models to calculate the various KPIs
  • Performing optimization to obtain complete perovskite solar cell recipes depending on the different KPIs
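As a minimal illustration of the optimization step, the sketch below evaluates two toy KPI surrogates over a grid of candidate recipes and extracts the Pareto-optimal set. The `cost` and `env_impact` functions here are hypothetical placeholders for a single recipe parameter; in the project these would be replaced by the actual machine-learning and physics-driven KPI models.

```python
import numpy as np

# Hypothetical KPI surrogates of a single recipe parameter x in [0, 1];
# stand-ins for the project's ML / physics-driven models.
def cost(x):
    return (x - 0.2) ** 2          # toy cost model, minimal at x = 0.2

def env_impact(x):
    return (x - 0.8) ** 2          # toy impact model, minimal at x = 0.8

def pareto_front(points):
    """Return indices of non-dominated points (all KPIs minimized)."""
    pts = np.asarray(points)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some other point is <= in every KPI
        # and strictly < in at least one.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Evaluate a grid of candidate "recipes" and extract the Pareto set.
xs = np.linspace(0.0, 1.0, 21)
objs = [(cost(x), env_impact(x)) for x in xs]
front = [xs[i] for i in pareto_front(objs)]
```

With these toy objectives the Pareto set is the trade-off region between the two optima (x between 0.2 and 0.8); a scalarization or evolutionary method would then pick recipes from this front according to the chosen KPI weighting.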

Skills

  • Interest and understanding of PV technologies and other energy technologies
  • Independent and motivated
  • Coding skills in Python or another language are necessary
  • Results interpretation and report writing
  • Language skills: English (C1/C2 level)
  • Systematic thinking and a problem-solving mindset
  • Background: Data science, Machine learning, Micro engineering, Energy science, others

Lectures:

  • Applied machine learning / machine learning / artificial intelligence
  • Fundamentals & processes for photovoltaic devices
  • Energy conversion and renewable energy

Supervision

If interested, please contact Naveen Bhati (naveen.bhati@epfl.ch), attaching your CV, cover letter, and transcript of records (Bachelor’s and Master’s). Short-listed candidates will be interviewed. Early applications are encouraged.

Practical information

The IPESE laboratory is located on the EPFL campus in Sion. Travel between Lausanne and Sion is compensated by EPFL.

References:

  1. Xie, T., Wan, Y., Huang, W., Zhou, Y., Liu, Y., Linghu, Q., … & Hoex, B. (2023). Large Language Models as Master Key: Unlocking the Secrets of Materials Science with GPT. arXiv preprint arXiv:2304.02213. https://arxiv.org/pdf/2304.02213.pdf
  2. Hu, Y., & Buehler, M. J. (2023). Deep language models for interpretative and predictive materials science. APL Machine Learning, 1(1), 010901. https://pubs.aip.org/aip/aml/article/1/1/010901/2878738/Deep-language-models-for-interpretative-and
  3. Flam-Shepherd, D., Zhu, K., & Aspuru-Guzik, A. (2022). Language models can learn complex molecular distributions. Nature Communications, 13(1), 3293. https://www.nature.com/articles/s41467-022-30839-x
  4. Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J. D., Dhariwal, P., … & Amodei, D. (2020). Language models are few-shot learners. Advances in neural information processing systems, 33, 1877-1901. https://proceedings.neurips.cc/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf
  5. Gao, T., Fisch, A., & Chen, D. (2020). Making pre-trained language models better few-shot learners. arXiv preprint arXiv:2012.15723. https://arxiv.org/abs/2012.15723