This repository was archived by the owner on Dec 16, 2025. It is now read-only.
Update count_tokens.py #459
Merged
logankilpatrick merged 1 commit into google-gemini:main on Jul 12, 2024
Conversation
- integrated returns into main snippet
- updated code comments
- pulled text of prompts out of the requests to `generate_content`
logankilpatrick approved these changes on Jul 12, 2024
eliben reviewed on Jul 12, 2024
    # Returns the "context window" for the model,
    # which is the combined input and output token limits.
    print(f"{model_info.input_token_limit=}")
Here the output is printed as key=value, whereas in most other places it's key: value.
Is there some system/method behind this?
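For context, the `key=value` form comes from Python's f-string `=` debugging specifier (available since Python 3.8), which echoes the expression text alongside its value. A minimal sketch of the two styles, using an illustrative value rather than a real model's limit:

```python
input_token_limit = 30720  # illustrative value, not queried from a real model

# The `=` specifier prints the expression itself, producing "key=value" output.
print(f"{input_token_limit=}")
# -> input_token_limit=30720

# The more common "key: value" style spells the label out by hand.
print(f"input_token_limit: {input_token_limit}")
# -> input_token_limit: 30720
```

The `=` form saves typing the label twice, which is likely why it appears in debug-style snippets, but it does produce the `key=value` output eliben notes.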
    # (`prompt_token_count` and `candidates_token_count`, respectively),
    # as well as the combined token count (`total_token_count`).
    print(response.usage_metadata)
    # ( prompt_token_count: 11, candidates_token_count: 73, total_token_count: 84 )
Note that the token count differs between the two approaches: 10 vs 11
    # (`prompt_token_count` and `candidates_token_count`, respectively),
    # as well as the combined token count (`total_token_count`).
    print(response.usage_metadata)
    # ( prompt_token_count: 264, candidates_token_count: 80, total_token_count: 345 )
Some off-by-1 issues here: prompt token count 263 vs 264.
Also 264 + 80 != 345.
    # Optionally, you can call `count_tokens` for the prompt and file separately.
    prompt = "Please give a short summary of this file."

    # Call `count_tokens` to get input token count
This sample uploads a text file, not a video - should the comment be amended?
eliben added a commit to google/generative-ai-go that referenced this pull request on Jul 12, 2024:
Aligning with the Python samples per google-gemini/deprecated-generative-ai-python#459. Main change is in the way output of snippets is marked.
TechRanger101 added a commit to TechRanger101/Generative-AI-GoLang that referenced this pull request on Nov 27, 2024:
Aligning with the Python samples per google-gemini/deprecated-generative-ai-python#459. Main change is in the way output of snippets is marked.
Description of the change

Motivation: Alignment and clarification of docs snippets

Type of change: Documentation - code snippets

Checklist: `git pull --rebase upstream main`