adev-ja/src/content/ai/develop-with-ai.en.md (4 additions, 0 deletions)
@@ -39,3 +39,7 @@ The Angular CLI includes an experimental [Model Context Protocol (MCP) server](h
 * <a href="/context/llm-files/llms-full.txt" target="_blank">llms-full.txt</a> - a more robust compiled set of resources describing how Angular works and how to build Angular applications.

 Be sure [to check out the overview page](/ai) for more information on how to integrate AI into your Angular applications.
+
+## Web Codegen Scorer
+
+The Angular team developed and open-sourced the [Web Codegen Scorer](https://github.com/angular/web-codegen-scorer), a tool to evaluate and score the quality of AI generated web code. You can use this tool to make evidence-based decisions relating to AI-generated code, such as fine-tuning prompts to improve the accuracy of LLM-generated code for Angular. These prompts can be included as system instructions for your AI tooling or as context with your prompt. You can also use this tool to compare the quality of code produced by different models and monitor quality over time as models and agents evolve.
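For readers who want to see how a compiled context file like llms-full.txt plugs into an LLM call, here is a minimal, hypothetical TypeScript sketch. The angular.dev URL, the `@google/genai` SDK, and the model name are assumptions chosen for illustration; none of them are prescribed by this change.

```ts
import { GoogleGenAI } from '@google/genai';

// Illustrative only: pass the compiled Angular context file to a model as a
// system instruction so its answers are grounded in Angular documentation.
async function askWithAngularContext(question: string): Promise<string | undefined> {
  // Assumption: llms-full.txt is publicly served at this URL.
  const contextRes = await fetch('https://angular.dev/context/llm-files/llms-full.txt');
  const angularContext = await contextRes.text();

  const ai = new GoogleGenAI({ apiKey: process.env['GEMINI_API_KEY'] });
  const response = await ai.models.generateContent({
    model: 'gemini-2.5-flash', // illustrative model name
    contents: question,
    config: { systemInstruction: angularContext },
  });
  return response.text;
}
```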
adev-ja/src/content/ai/overview.en.md (3 additions, 3 deletions)
@@ -54,7 +54,7 @@ Here is an example of how to build with Firebase AI Logic and Angular:
 This example includes an [in-depth video walkthrough explaining the functionality and demonstrates how to add new features](https://youtube.com/live/4vfDz2al_BI).

 ### Build AI-powered applications with Gemini API and Angular
-The [Gemini API](https://ai.google.dev/gemini-api/docs) provides access to state-of-the-art models from Google that supports audio, images, video, and text input. The models that are optimized for specific use cases, [learn more on the Gemini API documentation site](https://ai.google.dev/gemini-api/docs/models).
+The [Gemini API](https://ai.google.dev/gemini-api/docs) provides access to state-of-the-art models from Google that support audio, images, video, and text input. These models are optimized for specific use cases; [learn more on the Gemini API documentation site](https://ai.google.dev/gemini-api/docs/models).

 * [AI Text Editor Angular app template](https://github.com/FirebaseExtended/firebase-framework-tools/tree/main/starters/angular/ai-text-editor) - Use this template to start with a fully functioning text editor with AI-powered features like refining text, expanding text and formalizing text. This is a good starting point to gain experience with calling the Gemini API via HTTP.
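To make the bullet about "calling the Gemini API via HTTP" concrete, here is a small server-side TypeScript sketch. The REST endpoint shape and response structure follow the public Gemini API documentation, but the model name is illustrative and the key is assumed to come from server-side configuration rather than client code.

```ts
// Sketch of a plain HTTP call to the Gemini API; run this on the server so the
// API key never ships to the browser. The model name is illustrative.
interface GeminiResponse {
  candidates?: { content?: { parts?: { text?: string }[] } }[];
}

async function generateText(prompt: string, apiKey: string): Promise<string | undefined> {
  const url =
    'https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent';
  const res = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'x-goog-api-key': apiKey },
    body: JSON.stringify({ contents: [{ parts: [{ text: prompt }] }] }),
  });
  if (!res.ok) {
    throw new Error(`Gemini API request failed with status ${res.status}`);
  }
  const data = (await res.json()) as GeminiResponse;
  return data.candidates?.[0]?.content?.parts?.[0]?.text;
}
```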
@@ -64,7 +64,7 @@ The [Gemini API](https://ai.google.dev/gemini-api/docs) provides access to state
 ### Connecting to model providers and keeping your API Credentials Secure
 When connecting to model providers, it is important to keep your API secrets safe. *Never put your API key in a file that ships to the client, such as `environments.ts`*.

-Your application's architecture determines which AI APIs and tools to choose. Specifically, choose based on whether or not your application is client-side or server-side. Tools such as Firebase AI Logic provide a secure connection to the model APIs for client-side code. If you want to use a different API than Firerbase AI Logic or prefer to use a different model provider, consider creating a proxy-server or even [Cloud Functions for Firebase](https://firebase.google.com/docs/functions) to serve as a proxy and not expose your API keys.
+Your application's architecture determines which AI APIs and tools to choose. Specifically, choose based on whether your application is client-side or server-side. Tools such as Firebase AI Logic provide a secure connection to the model APIs for client-side code. If you want to use a different API than Firebase AI Logic or prefer to use a different model provider, consider creating a proxy server or even [Cloud Functions for Firebase](https://firebase.google.com/docs/functions) to serve as a proxy so that your API keys are not exposed.

 For an example of connecting using a client-side app, see the code: [Firebase AI Logic Angular example repository](https://github.com/angular/examples/tree/main/vertex-ai-firebase-angular-example).
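As a sketch of the proxy approach described above, the following hypothetical Cloud Functions for Firebase (2nd gen) HTTP function forwards a prompt to the Gemini API while reading the key from a managed secret. The function name, secret name, and model are illustrative and not part of this change.

```ts
import { onRequest } from 'firebase-functions/v2/https';
import { defineSecret } from 'firebase-functions/params';

// The API key is stored as a managed secret, never in code that ships to the client.
const geminiApiKey = defineSecret('GEMINI_API_KEY');

export const generate = onRequest({ secrets: [geminiApiKey] }, async (req, res) => {
  const prompt = String(req.body?.prompt ?? '');
  const upstream = await fetch(
    'https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent',
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', 'x-goog-api-key': geminiApiKey.value() },
      body: JSON.stringify({ contents: [{ parts: [{ text: prompt }] }] }),
    },
  );
  // Relay the upstream status and body; the key itself is never exposed to the caller.
  res.status(upstream.status).json(await upstream.json());
});
```

With a setup like this, the Angular application calls the function's URL instead of the model provider directly, so the credential stays on the server.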
@@ -88,7 +88,7 @@ Because models can return non-deterministic results, your applications should be
 Even considering these strategies and techniques, sensible fallbacks should be incorporated in your application design. Follow existing standards of application resiliency. For example, it is not acceptable for an application to crash if a resource or API is not available. In that scenario, an error message is displayed to the user and, if applicable, options for next steps are also displayed. Building AI-powered applications requires the same consideration. Confirm that the response is aligned with the expected output and provide a "safe landing" in case it is not aligned by way of [graceful degradation](https://developer.mozilla.org/en-US/docs/Glossary/Graceful_degradation). This also applies to API outages for LLM providers.

 Consider this example: The LLM provider is not responding. A potential strategy to handle the outage is:
-* Save the response from the user to used in a retry scenario (now or at a later time)
+* Save the response from the user to use in a retry scenario (now or at a later time)
 * Alert the user to the outage with an appropriate message that doesn't reveal sensitive information
 * Resume the conversation at a later time once the services are available again.
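The outage strategy in that list maps to a small amount of application code. The following TypeScript sketch works under assumed names (`pendingPrompts`, a `/api/generate` proxy endpoint) and is not taken from the diff: it saves the user's prompt for a later retry and returns a safe, generic message when the provider is unavailable.

```ts
// Illustrative fallback wrapper; the endpoint and variable names are assumptions.
const pendingPrompts: string[] = [];

async function callLlm(prompt: string): Promise<string> {
  const res = await fetch('/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) {
    throw new Error(`LLM request failed with status ${res.status}`);
  }
  return (await res.json()).text;
}

async function askWithFallback(prompt: string): Promise<string> {
  try {
    return await callLlm(prompt);
  } catch {
    // Save the prompt so the conversation can resume once the service recovers.
    pendingPrompts.push(prompt);
    // Keep the user-facing message generic; don't reveal provider or error details.
    return 'The assistant is temporarily unavailable. Your message was saved and will be retried.';
  }
}
```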