chore: Updating lc image to flex #1795
Open: gavin-aguiar wants to merge 10 commits into dev from gaaguiar/flex_tests.
Changes from all commits (10 commits):
0a1ade4  Updating lc image to flex (gavin-aguiar)
b8dfeee  Fixing test (gavin-aguiar)
6f777d1  Mounting proxy worker for 3.13 tests (gavin-aguiar)
8449783  Debug logging fix (gavin-aguiar)
5fb3177  Adding py314 (gavin-aguiar)
bfd59ee  Updated image repo for flexconsumption (gavin-aguiar)
888f387  Merge branch 'dev' of github.com:Azure/azure-functions-python-worker … (gavin-aguiar)
7736488  Merge branch 'dev' into gaaguiar/flex_tests (gavin-aguiar)
28bc120  Token updates for flex (gavin-aguiar)
ac43611  Merge branch 'gaaguiar/flex_tests' of github.com:Azure/azure-functions… (gavin-aguiar)
```diff
@@ -86,6 +86,7 @@ dev = [
     "pre-commit",
     "invoke",
     "cryptography",
+    "pyjwt",
     "jsonpickle",
     "orjson"
 ]
```
```diff
@@ -26,8 +26,12 @@
 # Linux Consumption Testing Constants
 _DOCKER_PATH = "DOCKER_PATH"
 _DOCKER_DEFAULT_PATH = "docker"
-_MESH_IMAGE_URL = "https://mcr.microsoft.com/v2/azure-functions/mesh/tags/list"
-_MESH_IMAGE_REPO = "mcr.microsoft.com/azure-functions/mesh"
+_OS_TYPE = "bookworm" if sys.version_info.minor < 14 else "noble"
+_MESH_IMAGE_URL = (
+    f"https://mcr.microsoft.com/v2/azure-functions/{_OS_TYPE}/"
+    "flexconsumption/tags/list"
+)
+_MESH_IMAGE_REPO = f"mcr.microsoft.com/azure-functions/{_OS_TYPE}/flexconsumption"
 _FUNC_GITHUB_ZIP = "https://github.com/Azure/azure-functions-python-library" \
                    "/archive/refs/heads/dev.zip"
 _FUNC_FILE_NAME = "azure-functions-python-library-dev"
```

Contributor comment on the `_MESH_IMAGE_URL` change: Will this also work for 3.14 in the future?
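The hunk above derives the base-image OS from the running interpreter version. A minimal sketch of that selection logic, with hypothetical helper names of my own (the PR itself uses module-level constants, not functions):

```python
def choose_os_type(minor: int) -> str:
    # Mirrors the diff: Debian "bookworm" images for Python minor versions
    # below 14, Ubuntu "noble" images for 3.14 and later.
    return "bookworm" if minor < 14 else "noble"


def mesh_image_repo(minor: int) -> str:
    # Assembles the flexconsumption repo path the same way the new
    # _MESH_IMAGE_REPO f-string in the diff does.
    os_type = choose_os_type(minor)
    return f"mcr.microsoft.com/azure-functions/{os_type}/flexconsumption"
```

With this split, the tags-list URL and the image repo stay in sync on a single OS-type value, which is the design choice the diff makes with `_OS_TYPE`.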
```diff
@@ -73,24 +77,16 @@ def assign_container(self, env: Dict[str, str] = {}):
         env["WEBSITE_SITE_NAME"] = self._uuid
         env["WEBSITE_HOSTNAME"] = f"{self._uuid}.azurewebsites.com"

-        # Debug: Print SCM_RUN_FROM_PACKAGE value
-        scm_package = env.get("SCM_RUN_FROM_PACKAGE", "NOT_SET")
-        print(f"🔍 DEBUG: SCM_RUN_FROM_PACKAGE in env: {scm_package}")
-
         # Wait for the container to be ready
-        max_retries = 60
+        max_retries = 10
         for i in range(max_retries):
             try:
                 ping_req = requests.Request(method="GET", url=f"{url}/admin/host/ping")
                 ping_response = self.send_request(ping_req)
                 if ping_response.ok:
-                    print(f"🔍 DEBUG: Container ready after {i + 1} attempts")
                     break
-                else:
-                    print("🔍 DEBUG: Ping attempt {i+1}/60 failed with status "
-                          f"{ping_response.status_code}")
             except Exception as e:
-                print(f"🔍 DEBUG: Ping attempt {i + 1}/60 failed with exception: {e}")
                 pass
             time.sleep(1)
         else:
             raise RuntimeError(f'Container {self._uuid} did not become ready in time')
```
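The readiness loop relies on Python's `for`/`else`: the `else` branch runs only when the loop completes without hitting `break` (or returning), which is what turns "ran out of retries" into a `RuntimeError`. A standalone sketch of the same pattern, where `probe` stands in for the `/admin/host/ping` request:

```python
import time


def wait_until_ready(probe, max_retries=10, delay=0.0):
    """Return the attempt count once probe() succeeds; raise if it never does."""
    for i in range(max_retries):
        try:
            if probe():
                return i + 1  # container answered the ping
        except Exception:
            pass  # treat connection errors the same as a failed ping
        time.sleep(delay)
    else:
        # Reached only when the loop exhausts all retries without returning.
        raise RuntimeError("container did not become ready in time")


# A probe that succeeds on its third call:
calls = {"n": 0}


def flaky_probe():
    calls["n"] += 1
    return calls["n"] >= 3
```

`wait_until_ready(flaky_probe, max_retries=5)` returns once the third attempt succeeds; a probe that always fails exhausts the loop and raises.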
```diff
@@ -125,16 +121,9 @@ def send_request(
         prepped = session.prepare_request(req)
         prepped.headers['Content-Type'] = 'application/json'

-        # Try to generate a proper JWT token first
-        try:
-            jwt_token = self._generate_jwt_token()
-            # Use JWT token for newer Azure Functions host versions
-            prepped.headers['Authorization'] = f'Bearer {jwt_token}'
-        except ImportError:
-            # Fall back to the old SWT token format if jwt library is not available
-            swt_token = self._get_site_restricted_token()
-            prepped.headers['x-ms-site-restricted-token'] = swt_token
-            prepped.headers['Authorization'] = f'Bearer {swt_token}'
+        # For flex consumption, use JWT Bearer token
+        jwt_token = self._generate_jwt_token()
+        prepped.headers['Authorization'] = f'Bearer {jwt_token}'

         # Add additional headers required by Azure Functions host
         prepped.headers['x-site-deployment-id'] = self._uuid
```
```diff
@@ -215,23 +204,43 @@ def _download_extensions() -> str:
     def spawn_container(self,
                         image: str,
                         env: Dict[str, str] = {}) -> int:
-        """Create a docker container and record its port. Create a docker
-        container according to the image name. Return the port of container.
-        """
-        # Construct environment variables and start the docker container
-        worker_path = os.path.join(PROJECT_ROOT, 'azure_functions_worker')
-
-        # TODO: Mount library in docker container
-        # self._download_azure_functions()
-
-        # Download python extension base package
-        ext_folder = self._download_extensions()
+        """Create a docker container and record its port."""
+        if not os.getenv('_DUMMY_CONT_KEY'):
+            os.environ['_DUMMY_CONT_KEY'] = "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
+        worker_name = 'azure_functions_worker' \
+            if sys.version_info.minor < 13 else 'proxy_worker'
+
+        worker_path = os.path.join(PROJECT_ROOT, worker_name)
         container_worker_path = (
             f"/azure-functions-host/workers/python/{self._py_version}/"
-            "LINUX/X64/azure_functions_worker"
+            f"LINUX/X64/{worker_name}"
         )

+        # For Python 3.13+, also mount the runtime libraries
+        runtime_v2_path = None
+        runtime_v1_path = None
+        container_runtime_v2_path = None
+        container_runtime_v1_path = None
+
+        if sys.version_info.minor >= 13:
+            repo_root = os.path.dirname(PROJECT_ROOT)
+            runtime_v2_path = os.path.join(
+                repo_root, 'runtimes', 'v2', 'azure_functions_runtime'
+            )
+            runtime_v1_path = os.path.join(
+                repo_root, 'runtimes', 'v1', 'azure_functions_runtime_v1'
+            )
+            container_runtime_v2_path = (
+                f"/azure-functions-host/workers/python/{self._py_version}/"
+                "LINUX/X64/azure_functions_runtime"
+            )
+            container_runtime_v1_path = (
+                f"/azure-functions-host/workers/python/{self._py_version}/"
+                "LINUX/X64/azure_functions_runtime_v1"
+            )
+
+        ext_folder = self._download_extensions()
+
         base_ext_container_path = (
             f"/azure-functions-host/workers/python/{self._py_version}/"
             "LINUX/X64/azurefunctions/extensions/base"
```
```diff
@@ -248,13 +257,25 @@ def spawn_container(self,
         run_cmd.extend(["--cap-add", "SYS_ADMIN"])
         run_cmd.extend(["--device", "/dev/fuse"])
         run_cmd.extend(["-e", f"CONTAINER_NAME={self._uuid}"])
-        run_cmd.extend(["-e",
-                        f"CONTAINER_ENCRYPTION_KEY={os.getenv('_DUMMY_CONT_KEY')}"])
+        encryption_key = os.getenv('_DUMMY_CONT_KEY')
+        full_key_bytes = base64.b64decode(encryption_key.encode())
+        aes_key_bytes = full_key_bytes[:32]
+        aes_key_base64 = base64.b64encode(aes_key_bytes).decode()
+        run_cmd.extend(["-e", f"CONTAINER_ENCRYPTION_KEY={aes_key_base64}"])
         run_cmd.extend(["-e", "WEBSITE_PLACEHOLDER_MODE=1"])
+        # Add required environment variables for JWT issuer validation
+        run_cmd.extend(["-e", f"WEBSITE_SITE_NAME={self._uuid}"])
+        run_cmd.extend(["-e", "WEBSITE_SKU=Dynamic"])
         run_cmd.extend(["-v", f'{worker_path}:{container_worker_path}'])

+        # Mount runtime libraries for Python 3.13+
+        if runtime_v2_path and runtime_v1_path:
+            run_cmd.extend([
+                "-v", f'{runtime_v2_path}:{container_runtime_v2_path}'
+            ])
+            run_cmd.extend([
+                "-v", f'{runtime_v1_path}:{container_runtime_v1_path}'
+            ])
+
         run_cmd.extend(["-v",
                         f'{base_ext_local_path}:{base_ext_container_path}'])
```
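The change above shortens the container encryption key before passing it to the host: the dummy key decodes to 64 bytes, but an AES-256 key must be exactly 32 bytes, so the first 32 bytes are kept and re-encoded to base64. A self-contained sketch of that truncation, using the same well-known dummy key the diff hard-codes:

```python
import base64

# The dummy key from the diff (the standard Azure storage emulator key);
# it base64-decodes to 64 bytes.
DUMMY_KEY = ("Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6"
             "IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==")

full_key_bytes = base64.b64decode(DUMMY_KEY.encode())
aes_key_bytes = full_key_bytes[:32]  # AES-256 requires exactly 32 key bytes
aes_key_base64 = base64.b64encode(aes_key_bytes).decode()
```

This is why the diff also switches `_encrypt_context` and the JWT signing to the truncated key: every party that derives material from `CONTAINER_ENCRYPTION_KEY` has to agree on the same 32 bytes.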
```diff
@@ -316,103 +337,72 @@ def safe_kill_container(self) -> bool:

     @classmethod
     def _get_site_restricted_token(cls) -> str:
-        """Get the header value which can be used by x-ms-site-restricted-token
-        which expires in one day.
-        """
-        # For compatibility with older Azure Functions host versions,
-        # try the old SWT format first
+        """Get SWT token for site-restricted authentication."""
         exp_ns = int((time.time() + 24 * 60 * 60) * 1000000000)
         token = cls._encrypt_context(os.getenv('_DUMMY_CONT_KEY'), f'exp={exp_ns}')
         return token

     def _generate_jwt_token(self) -> str:
-        """Generate a proper JWT token for newer Azure Functions host versions."""
+        """Generate JWT token for Flex consumption authentication."""
         try:
             import jwt
-        except ImportError:
-            # Fall back to SWT format if JWT library not available
-            return self._get_site_restricted_token()
-
-        # JWT payload matching Azure Functions host expectations
-        exp_time = int(time.time()) + (24 * 60 * 60)  # 24 hours from now
+        except ImportError as e:
+            raise RuntimeError("PyJWT library required. Install with: pip install pyjwt") from e

+        # Use the site name consistently for issuer and audience validation
+        exp_time = int(time.time()) + (24 * 60 * 60)
+        iat_time = int(time.time())
         site_name = self._uuid
-        container_name = self._uuid

-        # According to Azure Functions host analysis, use site-specific issuer format
-        # This matches the ValidIssuers array in ScriptJwtBearerExtensions.cs
         issuer = f"https://{site_name}.azurewebsites.net"

         payload = {
             'exp': exp_time,
-            'iat': int(time.time()),
-            # Use site-specific issuer format that matches ValidIssuers in the host
+            'iat': iat_time,
+            'nbf': iat_time,
             'iss': issuer,
-            # For Linux Consumption in placeholder mode, audience is the container name
-            'aud': container_name
+            'aud': site_name,
+            'sub': site_name,
         }

-        # Use the same encryption key for JWT signing
-        key = base64.b64decode(os.getenv('_DUMMY_CONT_KEY').encode())
+        encryption_key_str = os.getenv('_DUMMY_CONT_KEY')
+        if not encryption_key_str:
+            raise RuntimeError("_DUMMY_CONT_KEY environment variable not set")

-        # Generate JWT token using HMAC SHA256 (matches Azure Functions host)
+        key_bytes = base64.b64decode(encryption_key_str.encode())
+        key = key_bytes[:32]
         jwt_token = jwt.encode(payload, key, algorithm='HS256')
         return jwt_token
```
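`jwt.encode(payload, key, algorithm='HS256')` produces a three-part token: a base64url-encoded header and payload, joined by a dot, then an HMAC-SHA256 signature over that string. A stdlib-only sketch of that structure for illustration (the PR itself uses PyJWT; the claims below are shaped like the diff's but with a made-up site name):

```python
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding for every segment.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def hs256_jwt(payload: dict, key: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(key, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)


claims = {
    "iss": "https://mysite.azurewebsites.net",
    "aud": "mysite",
    "sub": "mysite",
}
token = hs256_jwt(claims, b"0" * 32)
```

The host validates `iss` and `aud` against the site name, which is why the diff sets `WEBSITE_SITE_NAME` in the container environment and uses the same site name for issuer, audience, and subject.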
```diff
     @classmethod
-    def _get_site_encrypted_context(cls,
-                                    site_name: str,
-                                    env: Dict[str, str]) -> str:
-        """Get the encrypted context for placeholder mode specialization"""
-        # Ensure WEBSITE_SITE_NAME is set to simulate production mode
+    def _get_site_encrypted_context(cls, site_name: str, env: Dict[str, str]) -> str:
+        """Get encrypted specialization context."""
         env["WEBSITE_SITE_NAME"] = site_name

-        ctx = {
-            "SiteId": 1,
-            "SiteName": site_name,
-            "Environment": env
-        }
+        ctx = {"SiteId": 1, "SiteName": site_name, "Environment": env}
         json_ctx = json.dumps(ctx)

         encrypted = cls._encrypt_context(os.getenv('_DUMMY_CONT_KEY'), json_ctx)
         return encrypted

     @classmethod
     def _encrypt_context(cls, encryption_key: str, plain_text: str) -> str:
-        """Encrypt plain text context into an encrypted message which can
-        be accepted by the host
-        """
-        # Decode the encryption key
+        """Encrypt context for specialization."""
         encryption_key_bytes = base64.b64decode(encryption_key.encode())
+        aes_key = encryption_key_bytes[:32]

-        # Pad the plaintext to be a multiple of the AES block size
         padder = padding.PKCS7(algorithms.AES.block_size).padder()
         plain_text_bytes = padder.update(plain_text.encode()) + padder.finalize()

-        # Initialization vector (IV) (fixed value for simplicity)
         iv_bytes = '0123456789abcedf'.encode()

-        # Create AES cipher with CBC mode
-        cipher = Cipher(algorithms.AES(encryption_key_bytes),
-                        modes.CBC(iv_bytes), backend=default_backend())
-
-        # Perform encryption
+        cipher = Cipher(algorithms.AES(aes_key), modes.CBC(iv_bytes), backend=default_backend())
         encryptor = cipher.encryptor()
         encrypted_bytes = encryptor.update(plain_text_bytes) + encryptor.finalize()

-        # Compute SHA256 hash of the encryption key
-        digest = hashes.Hash(hashes.SHA256(), backend=default_backend())
-        digest.update(encryption_key_bytes)
-        key_sha256 = digest.finalize()
-
-        # Encode IV, encrypted message, and SHA256 hash in base64
         iv_base64 = base64.b64encode(iv_bytes).decode()
         encrypted_base64 = base64.b64encode(encrypted_bytes).decode()

+        digest = hashes.Hash(hashes.SHA256(), backend=default_backend())
+        digest.update(aes_key)
+        key_sha256 = digest.finalize()
         key_sha256_base64 = base64.b64encode(key_sha256).decode()

-        # Return the final result
         return f'{iv_base64}.{encrypted_base64}.{key_sha256_base64}'

     def __enter__(self):
```
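AES-CBC only encrypts whole 16-byte blocks, which is why `_encrypt_context` runs the plaintext through a PKCS7 padder before creating the cipher. A stdlib sketch of what PKCS7 padding does (the `cryptography` package's `padding.PKCS7` used in the diff implements the same scheme):

```python
def pkcs7_pad(data: bytes, block_size: int = 16) -> bytes:
    # Append N copies of the byte value N, where N is the distance to the
    # next block boundary. A plaintext already on a boundary still gains a
    # full block of padding so unpadding is always unambiguous.
    n = block_size - (len(data) % block_size)
    return data + bytes([n]) * n


def pkcs7_unpad(padded: bytes) -> bytes:
    n = padded[-1]
    return padded[:-n]


# A JSON context like the one _get_site_encrypted_context produces:
padded = pkcs7_pad(b'{"SiteName": "test"}')
```

After encryption, the method packages the result as `iv.ciphertext.sha256(key)`, all base64, so the host can both decrypt the context and verify it was encrypted with the expected key.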
Contributor comment: Could we keep this assert but change the regex so it works for all Python versions? For example, both proxy and afw log "Finished prioritize_customer_dependencies"; we could check for this log instead.