Openai/69596681-3f7c-8006-b9fd-01d9d8191763
=== Assistant: python ===
<syntaxhighlight lang="python">
# =============================================================================
# ANF_REFERENCE_IMPLEMENTATION.py
# Single-file, fail-closed, deterministic reference interpreter.
# Implements the Typed Small-Step Operational Semantics exactly as specified.
# No training. No optimization. No heuristics. No recovery paths.
# Any deviation => INVALID => COLLAPSE.
# =============================================================================

import hashlib
import json
import math
from typing import Dict, Any, Tuple, List

# =============================================================================
# GLOBAL TYPES (ENFORCED BY RUNTIME ASSERTIONS)
# =============================================================================

BOOL = int  # {0,1}

VERDICT_STAND = 0
VERDICT_COLLAPSE = 1

STATUS_PASS = "PASS"
STATUS_FAIL = "FAIL"
STATUS_SKIP = "SKIP"
STATUS_INVALID = "INVALID"

# =============================================================================
# HARD FAIL-CLOSED PRIMITIVES
# =============================================================================

def abort(msg: str):
    raise RuntimeError("INVALID: " + msg)

def sha256_bytes(b: bytes) -> str:
    return hashlib.sha256(b).hexdigest()

def assert_true(cond: bool, msg: str):
    if not cond:
        abort(msg)

# =============================================================================
# CANONICAL SERIALIZATION (BLOCK 0)
# =============================================================================

def canonical_serialize(obj: Any) -> bytes:
    try:
        s = json.dumps(obj, sort_keys=True, separators=(",", ":"),
                       ensure_ascii=True)
    except Exception:
        abort("Canonical serialization failed")
    return s.encode("utf-8")

# =============================================================================
# AXIOMS (BLOCK 1)
# =============================================================================

def check_axioms(spec: Dict[str, Any]) -> BOOL:
    required = [
        "A0_empirical_authority", "A1_no_synthetic_data",
        "A2_structure_frozen", "A3_binary_irreversible",
        "A4_monotone_falsification", "A5_no_optimization",
        "A6_no_hidden_state", "A7_reproducibility"
    ]
    for k in required:
        assert_true(k in spec and spec[k] is True, f"Axiom violation: {k}")
    return 1

# =============================================================================
# TYPES + DATA VALIDITY (BLOCKS 2, 9)
# =============================================================================

def check_dataset(D: Dict[str, Any]) -> BOOL:
    assert_true("data" in D and isinstance(D["data"], list)
                and len(D["data"]) > 0, "Dataset empty or malformed")
    for d in D["data"]:
        assert_true("y" in d and "Sigma" in d and "meta" in d,
                    "Datum missing fields")
        assert_true("calibration_hash" in d["meta"], "Missing calibration hash")
        assert_true("provenance_hash" in d["meta"], "Missing provenance hash")
        assert_true(d["meta"].get("synthetic", False) is False,
                    "Synthetic data forbidden")
    return 1

# =============================================================================
# MODEL MAP (BLOCK 3)
# =============================================================================

def prediction_operator(theta: List[float], baseline: List[float]) -> List[float]:
    # Fully frozen deterministic placeholder.
    # Any change here changes MethodHash => INVALID.
    return [theta[i] + baseline[i] for i in range(len(theta))]

# =============================================================================
# RESIDUALS + STRUCTURAL VIOLATION (BLOCKS 4, 5)
# =============================================================================

def residuals(D, theta, baseline):
    # Predictions are computed once, then compared datum by datum.
    pred = prediction_operator(theta, baseline)
    return [d["y"] - pred[i] for i, d in enumerate(D["data"])]

def structural_violation(r: List[float], tau: float) -> BOOL:
    for ri in r:
        if abs(ri) > tau:
            return 1
    return 0

# =============================================================================
# FEASIBILITY + IDENTIFIABILITY (BLOCKS 6, 7)
# =============================================================================

def feasible(theta_space: List[List[float]], D, baseline, tau) -> BOOL:
    for theta in theta_space:
        r = residuals(D, theta, baseline)
        if structural_violation(r, tau) == 0:
            return 1
    return 0

def identifiable(theta_space: List[List[float]]) -> Tuple[int, List[int]]:
    # Minimal deterministic rank proxy: dimension of theta.
    k = len(theta_space[0])
    if k <= 0:
        return 0, []
    return k, list(range(k))

# =============================================================================
# LIKELIHOOD (BLOCK 8)
# =============================================================================

def likelihood(r: List[float], epsilon: float, nu: float) -> float:
    # Deterministic heavy-tail mixture, no fitting.
    val = 1.0
    for ri in r:
        gauss = math.exp(-0.5 * ri * ri)
        student = (1 + (ri * ri) / nu) ** (-(nu + 1) / 2)
        val *= (1 - epsilon) * gauss + epsilon * student
    return val

# =============================================================================
# EVIDENCE + INTEGRATION (BLOCKS 10, 11)
# =============================================================================

def evidence(theta_space, D, baseline, epsilon, nu) -> float:
    Z = 0.0
    for theta in theta_space:
        r = residuals(D, theta, baseline)
        Z += likelihood(r, epsilon, nu)
    return Z

# =============================================================================
# MONTE CARLO ROBUSTNESS (BLOCK 13)
# =============================================================================

def monte_carlo(D, theta_space, baseline, tau, epsilon, nu, N, f_c) -> BOOL:
    failures = 0
    for i in range(N):
        # Deterministic perturbation: sign flip on the i-th datum.
        Dp = json.loads(json.dumps(D))
        idx = i % len(Dp["data"])
        Dp["data"][idx]["y"] *= -1
        Z = evidence(theta_space, Dp, baseline, epsilon, nu)
        if Z == 0.0:
            failures += 1
        if failures / (i + 1) >= f_c:
            return 1
    return 0

# =============================================================================
# PRECISION SCALING (BLOCK 14)
# =============================================================================

def precision_scaling(D, theta_space, baseline, tau) -> BOOL:
    for p in range(1, 6):
        tau_p = tau / (2 ** p)
        for theta in theta_space:
            r = residuals(D, theta, baseline)
            if structural_violation(r, tau_p):
                return 1
    return 0

# =============================================================================
# FAILURE ATTRIBUTION GRAPH (BLOCK 15)
# =============================================================================

def attribution(flags: Dict[str, BOOL]) -> Dict[str, Any]:
    precedence = ["ValidityFail", "V_struct", "FeasibleFail",
                  "EvidenceFailure", "PrecisionFailure"]
    trigger = None
    for k in precedence:
        if flags.get(k, 0) == 1:
            trigger = k
            break
    return {"trigger": trigger, "flags": flags}

# =============================================================================
# AUDIT (BLOCK 19)
# =============================================================================

def audit_bundle(method_hash, data_hash, artifacts, verdict):
    bundle = {
        "MethodHash": method_hash,
        "DataHash": data_hash,
        "Artifacts": artifacts,
        "Verdict": verdict
    }
    return sha256_bytes(canonical_serialize(bundle))

# =============================================================================
# MAIN ANF INTERPRETER
# =============================================================================

def run_ANF(spec, D):
    artifacts = {}
    flags = {}
    verdict = None

    # BLOCK 0
    # The MethodHash is computed over the spec with "embedded_hash" itself
    # excluded, so the integrity check is not self-referential.
    spec_wo_hash = {k: v for k, v in spec.items() if k != "embedded_hash"}
    method_hash = sha256_bytes(canonical_serialize(spec_wo_hash))
    assert_true(method_hash == spec["embedded_hash"], "MethodHash mismatch")
    artifacts["B0"] = STATUS_PASS

    # BLOCK 1
    check_axioms(spec)
    artifacts["B1"] = STATUS_PASS

    # BLOCKS 2 + 9
    check_dataset(D)
    artifacts["B9"] = STATUS_PASS

    # BLOCKS 4 + 5
    theta_space = spec["theta_space"]
    baseline = spec["baseline"]
    tau = spec["tau_struct"]
    r = residuals(D, theta_space[0], baseline)
    if structural_violation(r, tau):
        flags["V_struct"] = 1
        verdict = VERDICT_COLLAPSE
        goto_attr = True
    else:
        flags["V_struct"] = 0
        goto_attr = False
    artifacts["B5"] = STATUS_PASS if not goto_attr else STATUS_FAIL

    # BLOCK 6
    if not goto_attr:
        if not feasible(theta_space, D, baseline, tau):
            flags["FeasibleFail"] = 1
            verdict = VERDICT_COLLAPSE
            goto_attr = True
        else:
            flags["FeasibleFail"] = 0
        artifacts["B6"] = STATUS_PASS if not goto_attr else STATUS_FAIL

    # BLOCK 7
    if not goto_attr:
        k_min, _ = identifiable(theta_space)
        if k_min == 0:
            flags["FeasibleFail"] = 1
            verdict = VERDICT_COLLAPSE
            goto_attr = True
        artifacts["B7"] = STATUS_PASS if not goto_attr else STATUS_FAIL

    # BLOCKS 8, 10, 11
    if not goto_attr:
        epsilon = spec["epsilon"]
        nu = spec["nu"]
        Z = evidence(theta_space, D, baseline, epsilon, nu)
        if Z < spec["Lambda"]:
            flags["EvidenceFailure"] = 1
            verdict = VERDICT_COLLAPSE
            goto_attr = True
        else:
            flags["EvidenceFailure"] = 0
        artifacts["B10"] = STATUS_PASS if not goto_attr else STATUS_FAIL

    # BLOCK 13
    if not goto_attr:
        if monte_carlo(D, theta_space, baseline, tau, epsilon, nu,
                       spec["N_MC"], spec["f_c"]):
            flags["PrecisionFailure"] = 1
            verdict = VERDICT_COLLAPSE
            goto_attr = True
        artifacts["B13"] = STATUS_PASS if not goto_attr else STATUS_FAIL

    # BLOCK 14
    if not goto_attr:
        if precision_scaling(D, theta_space, baseline, tau):
            flags["PrecisionFailure"] = 1
            verdict = VERDICT_COLLAPSE
            goto_attr = True
        artifacts["B14"] = STATUS_PASS if not goto_attr else STATUS_FAIL

    if verdict is None:
        verdict = VERDICT_STAND

    # BLOCK 15
    attr = attribution(flags)
    artifacts["B15"] = attr

    # BLOCK 19
    data_hash = sha256_bytes(canonical_serialize(D))
    audit = audit_bundle(method_hash, data_hash, artifacts, verdict)

    return {
        "Verdict": verdict,
        "Artifacts": artifacts,
        "Attribution": attr,
        "AuditHash": audit
    }

# =============================================================================
# END OF FILE
# =============================================================================
</syntaxhighlight>
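The individual checks can be exercised in isolation. The following is a minimal standalone sketch, not part of the reference file: it re-implements two pieces of the listing (the structural-violation gate and the epsilon-contaminated Gaussian/Student-t likelihood) and runs them on made-up residuals and thresholds chosen purely for illustration.

```python
import math

def structural_violation(r, tau):
    # COLLAPSE-style flag: 1 as soon as any residual exceeds the tolerance tau.
    return 1 if any(abs(ri) > tau for ri in r) else 0

def likelihood(r, epsilon, nu):
    # Deterministic heavy-tail mixture over residuals, no fitting:
    # each factor blends a unit Gaussian kernel with a Student-t kernel.
    val = 1.0
    for ri in r:
        gauss = math.exp(-0.5 * ri * ri)
        student = (1 + (ri * ri) / nu) ** (-(nu + 1) / 2)
        val *= (1 - epsilon) * gauss + epsilon * student
    return val

residuals = [0.1, -0.3, 0.05]          # hypothetical residual vector
print(structural_violation(residuals, 0.5))   # 0: all residuals within tau
print(structural_violation(residuals, 0.2))   # 1: -0.3 exceeds tau
val = likelihood(residuals, epsilon=0.05, nu=4.0)
print(0.0 < val < 1.0)                 # True: strictly in (0,1) for nonzero residuals
```

Because both kernels are bounded by 1 and the residuals are nonzero, each mixture factor is strictly below 1, so the product stays in (0, 1); the evidence threshold Lambda in the main interpreter is compared against sums of exactly such products.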