Using artificial intelligence (AI) / large language models (LLMs) with perovskite solar cells

Status: past

Semester/Master project

Context

In this project, the focus is on using NLP models (LSTMs, BERT, etc.) and LLMs for exploratory data analysis of the developed perovskite solar cell (PSC) database, while enriching the dataset with additional features by connecting to different online databases and by collecting data from the existing literature. These models will then be used to extract meaningful relationships from the database that can help improve the efficiency and stability of the overall cells, and to assess the predictive capabilities of the models themselves. The proposed project is open to adaptation based on the evolving AI domain and on the final performance indicators of interest (e.g., stability).
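As a minimal sketch of the data-enrichment idea described above, the snippet below joins toy PSC records with features obtained from an external source using pandas. The schema (column names) and the external table are invented for illustration; the bandgap values are typical literature numbers, and the real project would fetch such features programmatically from online materials databases.

```python
import pandas as pd

# Toy stand-in for the developed PSC database (schema is illustrative only)
psc = pd.DataFrame({
    "absorber": ["MAPbI3", "FAPbI3", "CsPbBr3"],
    "pce_percent": [18.2, 20.1, 9.7],
})

# Stand-in for features fetched from an external materials database;
# bandgap values are typical literature figures, for illustration only
external = pd.DataFrame({
    "absorber": ["MAPbI3", "FAPbI3", "CsPbBr3"],
    "bandgap_ev": [1.55, 1.48, 2.30],
})

# Left join keeps every PSC record and attaches the new feature column
enriched = psc.merge(external, on="absorber", how="left")
print(enriched)
```

A left join is used so that PSC records without a match in the external source are kept (with a missing feature value) rather than silently dropped.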

Project

The project will be structured in the following parts:

  • Developing an understanding of the existing database
  • Using different AI models to interact with the existing database
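As a simple illustration of the second step, the sketch below retrieves database entries matching a free-text query: device-stack descriptions are embedded with TF-IDF (a classical stand-in for BERT/LLM embeddings) and ranked by cosine similarity. The records, column names, and query are invented for illustration; the actual database schema and the embedding model will differ.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-in for the developed PSC database (schema is illustrative only)
db = pd.DataFrame({
    "device_stack": [
        "FTO / TiO2 / MAPbI3 / spiro-OMeTAD / Au",
        "ITO / SnO2 / FAPbI3 / PTAA / Ag",
        "FTO / TiO2 / CsFAMA triple cation / spiro-OMeTAD / Au",
    ],
    "pce_percent": [18.2, 20.1, 21.4],
})

# Embed stack descriptions and a free-text query in the same vector space;
# the token pattern keeps hyphenated names like "spiro-OMeTAD" intact
vec = TfidfVectorizer(token_pattern=r"[A-Za-z0-9\-]+")
doc_vecs = vec.fit_transform(db["device_stack"])
query_vec = vec.transform(["triple cation perovskite with spiro-OMeTAD"])

# Rank database entries by similarity to the query and return the best match
scores = cosine_similarity(query_vec, doc_vecs).ravel()
best = db.iloc[scores.argmax()]
print(best["device_stack"], best["pce_percent"])
```

In the actual project, the TF-IDF step would be replaced by contextual embeddings (e.g., from BERT) or by an LLM queried directly, but the retrieve-and-rank pattern stays the same.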

Skills

  • Interest in and understanding of PV technologies and other energy technologies
  • Independent and motivated
  • Coding skills in Python or another language are necessary
  • Results interpretation and report writing
  • Language skills: English (C1/C2 level)
  • Systematic thinker and problem solver
  • Background: data science, machine learning, microengineering, energy science, or related fields

Lectures

  • Applied machine learning / machine learning / artificial intelligence
  • Fundamentals & processes for photovoltaic devices
  • Energy conversion and renewable energy

Supervision

If interested, please contact Naveen Bhati (naveen.bhati@epfl.ch), attaching your CV, cover letter, and transcript of records (Bachelor's and Master's). Short-listed candidates will be interviewed. Early applications are encouraged.

Practical information

The IPESE laboratory is located on the EPFL campus in Sion. Whether work takes place in the Sion office or remotely depends on the COVID-19 situation. Travel between Lausanne and Sion is compensated by EPFL.

References:

  1. Xie, T., Wan, Y., Huang, W., Zhou, Y., Liu, Y., Linghu, Q., … & Hoex, B. (2023). Large Language Models as Master Key: Unlocking the Secrets of Materials Science with GPT. arXiv preprint arXiv:2304.02213. https://arxiv.org/pdf/2304.02213.pdf
  2. Hu, Y., & Buehler, M. J. (2023). Deep language models for interpretative and predictive materials science. APL Machine Learning, 1(1), 010901. https://pubs.aip.org/aip/aml/article/1/1/010901/2878738/Deep-language-models-for-interpretative-and
  3. Flam-Shepherd, D., Zhu, K., & Aspuru-Guzik, A. (2022). Language models can learn complex molecular distributions. Nature Communications, 13(1), 3293. https://www.nature.com/articles/s41467-022-30839-x
  4. Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J. D., Dhariwal, P., … & Amodei, D. (2020). Language models are few-shot learners. Advances in neural information processing systems, 33, 1877-1901. https://proceedings.neurips.cc/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf
  5. Gao, T., Fisch, A., & Chen, D. (2020). Making pre-trained language models better few-shot learners. arXiv preprint arXiv:2012.15723. https://arxiv.org/abs/2012.15723