The 星跃计划 research collaboration program, jointly launched by Microsoft Research Asia and Microsoft headquarters, invites you to apply! New projects from the Applied Research team at the global headquarters of Microsoft Experiences + Devices are now open, and we welcome your interest and applications. Join 星跃计划 and explore more possibilities in research with us, across the ocean!
The program aims to give outstanding talent the opportunity to work with research teams at Microsoft's global headquarters on real, cutting-edge problems. You will do impactful research in an international research environment, in a diverse and inclusive atmosphere, under the guidance of top researchers.
This round of cross-lab joint research recruitment focuses on natural language processing; the project is Knowledge Patching with Large Language Model. 星跃计划 openings will be updated on an ongoing basis, so stay tuned for the latest news!
Conduct research under the joint guidance of top researchers from Microsoft Research Asia and Microsoft's global headquarters, and exchange ideas in depth with researchers from different backgrounds
Focus on real, cutting-edge problems from industry, and strive for results with impact on both academia and industry
Experience Microsoft's international, open research atmosphere and its diverse, inclusive culture through offline and online collaboration
Currently enrolled master's or Ph.D. students (see each project's specific requirements); deferred or gap-year students
Able to work full-time in China for 6-12 months
See the project descriptions below for each project's detailed requirements
Knowledge Patching with Large Language Model
Large generative pretrained transformers (e.g., GPT-3.5, ChatGPT) have shown surprisingly good performance in language modeling and have been successfully productized for long-form code generation (e.g., Codex powering GitHub Copilot). However, for long-form natural language generation (e.g., writing an email for an office worker), the performance of large language models (LLMs) is still far from perfect, because they lack the domain knowledge and user-specific information that are both critical for writing appropriate content in context.
To address this challenge and support long-form natural language generation, we propose to equip the LLM with a novel knowledge patch that lets it digest the necessary knowledge at runtime without parameter tuning. The proposed knowledge patch is a non-parametric module that can store any knowledge we want the LLM to digest at runtime. It can largely be built offline efficiently (e.g., from the content of a user's previous documents and emails) and is easy to maintain (e.g., adding new knowledge or removing stale knowledge). Like patches to an operating system, the knowledge patch not only keeps the LLM always up to date, but also turns the model into a personalized writing assistant that truly writes to meet the user's needs. We believe an LLM equipped with this knowledge patch will be the key modeling technology for long-form generation.
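To make the idea concrete, here is a minimal sketch of what a non-parametric knowledge patch could look like: an editable store of snippets, built offline, that is queried at runtime and injected into the prompt so the base model needs no parameter tuning. All names, the word-overlap retrieval, and the prompt format are illustrative assumptions, not the project's actual design.

```python
class KnowledgePatch:
    """Illustrative non-parametric knowledge store: entries can be added or
    removed at any time without retraining the underlying model."""

    def __init__(self):
        self.entries = {}  # entry_id -> snippet text

    def add(self, entry_id, text):
        # e.g., built offline from a user's previous documents and emails
        self.entries[entry_id] = text

    def remove(self, entry_id):
        # drop stale knowledge, like removing an outdated OS patch
        self.entries.pop(entry_id, None)

    def retrieve(self, query, k=2):
        # Toy relevance score: word overlap between query and snippet.
        # A real system would use a learned retriever or embeddings.
        q = set(query.lower().split())
        scored = sorted(
            self.entries.values(),
            key=lambda text: len(q & set(text.lower().split())),
            reverse=True,
        )
        return scored[:k]


def build_prompt(patch, user_request):
    """Inject retrieved knowledge into the prompt at runtime, so the LLM
    can 'digest' it without any parameter tuning."""
    context = "\n".join(f"- {s}" for s in patch.retrieve(user_request))
    return f"Known facts:\n{context}\n\nTask: {user_request}"


patch = KnowledgePatch()
patch.add("team", "The weekly sync meeting moved to Tuesday 10am.")
patch.add("style", "Emails to clients should use a formal greeting.")
prompt = build_prompt(patch, "Write an email about the weekly sync meeting")
```

The point of the sketch is the maintenance story: updating what the model "knows" is a dictionary edit, not a fine-tuning run, which is what makes the patch analogy to operating systems apt.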
Research Areas
Natural Language Processing
Qualifications
Ph.D. students majoring in computer science, electronic engineering, or related areas
Self-motivated and passionate about research
Solid coding skills
Experienced in Natural Language Processing
Eligible applicants, please fill out the application form below:
https://jinshuju.net/f/TpaCqG
or scan the QR code below to apply now!