Now we will create wafer maps using the die analyses we triggered in the die analysis notebook. Make sure all of those analyses have finished (i.e. make sure the last cell in that notebook has finished running)!
As before, make sure you have the following environment variables set or added to a .env file:
GDSFACTORY_HUB_API_URL="https://{org}.gdsfactoryhub.com"
GDSFACTORY_HUB_QUERY_URL="https://query.{org}.gdsfactoryhub.com"
GDSFACTORY_HUB_KEY="<your-gdsfactoryplus-api-key>"
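If you keep these values in a .env file, the short sketch below (assuming the python-dotenv package is installed) loads the file and checks that all three variables are present:
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # read the .env file in the current working directory, if present
for name in ("GDSFACTORY_HUB_API_URL", "GDSFACTORY_HUB_QUERY_URL", "GDSFACTORY_HUB_KEY"):
    assert os.environ.get(name), f"{name} is not set"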
import getpass

import gdsfactoryhub as gfh  # assumed import alias; adjust to match your installed client package

project_id = f"cutback-{getpass.getuser()}"
client = gfh.create_client_from_env(project_id=project_id)
api = client.api()
query = client.query()
Let's define the upper and lower spec limits for Known Good Die (KGD).
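As a convenience we can name the limits up front; the values match the min_output and max_output arguments used in the cells below (the variable names here are only illustrative):
lower_spec = 0.0  # lower spec limit on component_loss for a known good die
upper_spec = 0.115  # upper spec limit on component_loss for a known good die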
Let's find a wafer pkey for this project, so that we can trigger the wafer-level analysis on it.
wafer_pks = [w["pk"] for w in query.wafers().execute().data]  # primary keys of all wafers in the project
# Run the analysis function directly on one wafer to check its output before uploading it
output = aggregate_die_analyses.run(
    wafer_pkey=wafer_pks[0],
    die_function_id="cutback_loss",
    output_key="component_loss",
    min_output=0.0,
    max_output=0.115,
)
# Upload the analysis function to the hub; suppress_api_error lets this cell
# be re-run even if the function was already uploaded
with gfh.suppress_api_error():
    result = api.upload_function(
        function_id="aggregate_die_analyses",
        target_model="wafer",
        file=gfh.get_module_path(aggregate_die_analyses),
        test_target_model_pk=wafer_pks[0],
        test_kwargs={
            "die_function_id": "cutback_loss",
            "output_key": "component_loss",
            "min_output": 0.0,
            "max_output": 0.115,
        },
    )
# Run analysis on all wafers
task_ids = []
for wafer_pk in wafer_pks:
    print(f"Starting analysis for wafer: {wafer_pk}")
    task_id = api.start_analysis(
        analysis_id=f"wafer_analysis_{wafer_pk}",
        function_id="aggregate_die_analyses",
        target_model="wafer",
        target_model_pk=wafer_pk,
    )
    task_ids.append(task_id)
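The returned task IDs let you check that all wafer-level analyses have finished before building the wafer maps. Below is a minimal polling sketch; the status call (api.get_task_status) is a hypothetical placeholder, so substitute whichever task-status method your client actually exposes.
import time

def wait_for_tasks(api, task_ids, poll_s=10.0):
    """Block until every task in task_ids reaches a terminal state."""
    pending = set(task_ids)
    while pending:
        for task_id in list(pending):
            status = api.get_task_status(task_id)  # hypothetical call; replace with the real method
            if status in ("finished", "failed"):
                print(f"task {task_id}: {status}")
                pending.discard(task_id)
        if pending:
            time.sleep(poll_s)

wait_for_tasks(api, task_ids)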