What are two key components of transformer architecture that support today's generative AI?

- Recurrent Neural Networks (RNNs) and memory retention
- Attention and positional encoding
- Prompt engineering and groundedness
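The two transformer components named among the options above can be illustrated with a minimal NumPy sketch: sinusoidal positional encoding added to the inputs, followed by scaled dot-product attention. All names, shapes, and values here are illustrative, not part of the quiz.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal positional encoding: even dims get sin, odd dims get cos,
    # so each position receives a unique, smoothly varying signature.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angle[:, 0::2])
    enc[:, 1::2] = np.cos(angle[:, 1::2])
    return enc

def attention(q, k, v):
    # Scaled dot-product attention: softmax(QK^T / sqrt(d)) V.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy input: 4 token embeddings of width 8, with positions encoded in.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8)) + positional_encoding(4, 8)
out, w = attention(x, x, x)   # self-attention over the sequence
```

Each row of `w` is a probability distribution over the sequence, which is what lets the model weigh every token against every other regardless of distance.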
What is the main difference between Large Language Models and Small Language Models?

- Large Language Models are trained on vast quantities of text covering a wide range of general subject matter, while Small Language Models are trained on smaller, more subject-focused datasets.
- Large Language Models are trained to include an understanding of context, while Small Language Models aren't.
- Large Language Models have fewer parameters than Small Language Models.
What is the purpose of fine-tuning in the context of generative AI?

- It's used to manage access, authentication, and data usage in AI models.
- It involves connecting a language model to an organization's proprietary database.
- It involves further training a pretrained model on a task-specific dataset to make it more suitable for a particular application.
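The idea of further training a pretrained model on a task-specific dataset can be sketched with a toy model: start from "pretrained" weights and continue gradient descent on a small new dataset. The model, data, and learning rate below are all assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Weights obtained from broad "pretraining" (assumed starting point).
w = np.array([1.0, -2.0])

# Small task-specific dataset whose true mapping differs from pretraining.
X = rng.normal(size=(32, 2))
y = X @ np.array([1.5, -1.0])

# Fine-tuning: a number of gradient steps on the new data adapt the
# pretrained weights toward the task instead of training from scratch.
lr = 0.1
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(X)
    w -= lr * grad
```

After fine-tuning, `w` has moved from the pretrained values toward the task's own mapping, which is the essence of the technique at any scale.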
What are the four stages in the process of developing and implementing a plan for responsible AI when using generative models according to Microsoft's guidance?

- Identify potential benefits, Measure the benefits, Enhance the benefits, Operate the solution responsibly
- Identify potential harms, Measure these harms, Mitigate the harms, Operate the solution responsibly
- Define the problem, Design the solution, Develop the solution, Deploy the solution
You must answer all questions before checking your work.