mirror of https://github.com/Z3Prover/z3 synced 2026-02-26 18:15:37 +00:00

Param tuning updates (#8137)

* parameter eval order

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* Bump github/codeql-action from 3 to 4 (#7971)

Bumps [github/codeql-action](https://github.com/github/codeql-action) from 3 to 4.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/github/codeql-action/compare/v3...v4)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-version: '4'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Update Z3_mk_datatype_sort API to accept array of sort parameters and add Z3_mk_polymorphic_datatype (#7966)

* Initial plan

* Update Z3_mk_datatype_sort API to accept array of parameters

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Update language bindings for parametric datatype sort API

- Python: Updated DatatypeSort() to accept optional params list
- OCaml: Added mk_sort_ref_p/mk_sort_ref_ps for parametric datatypes
- .NET: Added MkDatatypeSortRef() methods with optional params
- Java: Added mkDatatypeSortRef() methods with optional params
- All changes maintain backward compatibility

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Fix .NET build error - rename params to parameters

The 'params' keyword is reserved in C#, causing compilation errors.
Renamed parameter from 'params' to 'parameters' in MkDatatypeSortRef methods.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Add unit test for parametric datatypes

Added test_parametric_datatype.cpp to demonstrate polymorphic datatypes.
The test creates two concrete instantiations of a generic pair concept:
- pair_int_real with fields (first:Int, second:Real)
- pair_real_int with fields (first:Real, second:Int)
Then verifies that accessors work correctly and type checking is enforced.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Implement polymorphic type variables support for parametric datatypes

- Modified datatype_decl_plugin to allow type variables as parameters
- Updated mk_datatype_decl to extract type variables from field sorts
- Type variables are collected in order of first appearance
- Revised unit test to use Z3_mk_type_variable for polymorphic datatypes
- Test creates pair datatype with type variables alpha and beta
- Successfully instantiates with concrete types (pair Int Real) and (pair Real Int)
- Verifies accessor types match and equality terms are well-typed
- All tests pass

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Add Z3_mk_polymorphic_datatype API and refactor datatype creation

- Added new API Z3_mk_polymorphic_datatype to z3_api.h
- Renamed static mk_datatype_decl to api_datatype_decl in api_datatype.cpp
- Modified api_datatype_decl to accept explicit type parameters
- Updated all callers to use renamed function
- Added test_polymorphic_datatype_api demonstrating new API usage
- Both tests pass successfully

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Remove type variable collection logic from constructors

Removed the logic for collecting type variables from field sorts based on constructors.

* Update comments on parameter handling in api_datatype.cpp

Clarify usage of parameters in API documentation.

* Fix OCaml build error - use list instead of array for mk_datatype_sort

Changed mk_sort_ref to pass empty list [] instead of empty array [||].
Changed mk_sort_ref_p to pass params list directly instead of converting to array.
Z3native.mk_datatype_sort expects a list, not an array.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Add polymorphic datatype example to C++ examples

Added polymorphic_datatype_example() demonstrating:
- Creating type variables alpha and beta with Z3_mk_type_variable
- Defining parametric Pair datatype with fields of type alpha and beta
- Instantiating with concrete types (Pair Int Real) and (Pair Real Int)
- Getting constructors and accessors from instantiated datatypes
- Creating constants and expressions using the polymorphic types
- Verifying type correctness with equality (= (first p1) (second p2))

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>
Co-authored-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* trim parametric datatype test

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* restore single cell

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* restore the method behavior

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* setting up python tuning experiment, not done

* Add finite_set_value_factory for creating finite set values in model generation (#7981)

* Initial plan

* Add finite_set_value_factory implementation

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Remove unused dl_decl_plugin variable and include

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Update copyright and add TODOs in finite_set_value_factory

Updated copyright information and added TODO comments for handling in finite_set_value_factory methods.

* Update copyright information in finite_set_value_factory.h

Updated copyright year from 2006 to 2025.

* Implement finite_set_value_factory using array_util to create singleton sets

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Simplify empty set creation in finite_set_value_factory

Refactor finite_set_value_factory to simplify empty set handling and remove array-specific logic.

* Change family ID for finite_set_value_factory

* Fix build error by restoring array_decl_plugin include and implementation

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Update finite_set_value_factory.h

* Add SASSERT for finite set check in factory

Added assertion to check if the sort is a finite set.

* Rename member variable from m_util to u

* Refactor finite_set_value_factory for value handling

* Use register_value instead of direct set insertion

Replaced direct insertion into set with register_value calls.

* Update finite_set_value_factory.cpp

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>
Co-authored-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Revert "Add finite_set_value_factory for creating finite set values in model …" (#7985)

This reverts commit 05ffc0a77b.

* Update arith_rewriter.cpp

fix memory leak introduced by update to ensure determinism

* update python prototyping experiment; need to add a couple more things

* add explicit constructors to fix the nightly mac build failure

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* build fixes

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fixes

* fix some more things but now it hangs

* changing multithread to multiprocess seems to have resolved the current deadlock

* fix some bugs, it seems to run now

* fix logic about checking clauses individually, and add proof prefix clause selection (naively) via the OnClause hook

* disable manylinux until segfault is resolved

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* add the "noexcept" keyword to the value_score=(value_score&&) declaration

* expose a status flag for clauses but every single one is being coded as an assumption...

* Add a fast-path to _coerce_exprs. (#7995)

When the inputs are already the same sort, we can skip most of the
coercion logic and just return.

Currently, `_coerce_exprs` is by far the most expensive part of
building up many common Z3 ASTs, so this fast-path is a substantial
speedup for many use-cases.
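The fast-path idea can be modeled independently of z3py internals (all names here are hypothetical stand-ins for the actual `_coerce_exprs` machinery):

```python
def coerce_pair(a, b, sort_of, coerce_to):
    # Fast path: if both expressions already have the same sort, skip the
    # expensive coercion machinery entirely and return them unchanged.
    sa, sb = sort_of(a), sort_of(b)
    if sa == sb:
        return a, b
    # Slow path: compute a common sort and coerce both sides to it.
    target = max(sa, sb)  # stand-in for z3py's sort-join logic
    return coerce_to(a, target), coerce_to(b, target)

# Toy "sorts": 0 = Int, 1 = Real; coercion widens to the larger sort.
def sort_of(x):
    return 1 if isinstance(x, float) else 0

def coerce_to(x, target):
    return float(x) if target == 1 else x
```

With these stand-ins, `coerce_pair(2, 3, ...)` returns its arguments untouched, while `coerce_pair(2, 3.5, ...)` widens the int to a float via the slow path.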

* Bump actions/setup-node from 5 to 6 (#7994)

Bumps [actions/setup-node](https://github.com/actions/setup-node) from 5 to 6.
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](https://github.com/actions/setup-node/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Enabling Control Flow Guard (CFG) by default for MSVC on Windows, with options to disable CFG. (#7988)

* Enabling Control Flow Guard by default for MSVC on Windows, with options to disable it.

* Fix configuration error for non-MSVC compilers.

* Reviewed and updated configuration for Python build and added comment for CFG.

* try exponential delay in grobner

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* throttle grobner method more actively

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* enable always add all coeffs in nlsat

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* disable centos build until resolved

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* update centos version

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Add missing mkLastIndexOf method and CharSort case to Java API (#8002)

* Initial plan

* Add mkLastIndexOf method and CharSort support to Java API

- Added mkLastIndexOf method to Context.java for extracting last index of sub-string
- Added Z3_CHAR_SORT case to Sort.java's create() method switch statement
- Added test file to verify both fixes work correctly

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Fix author field in test file

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Delete examples/java/TestJavaAPICompleteness.java

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>
Co-authored-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Bump actions/download-artifact from 5 to 6 (#7999)

Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 5 to 6.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](https://github.com/actions/download-artifact/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump actions/upload-artifact from 4 to 5 (#7998)

Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4 to 5.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* initial parameter probe thread setup in C++

* fix build break introduced when adding support for polymorphic datatypes

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* re-enable CentOS AMD nightly

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* more param tuning setup

* fix C++ example and add polymorphic interface for C++

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* update release notes

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* bump version for release

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* setting up the param probe solvers and mutation generator

* adding the learned clauses from the internalizer

* fix some things for clause replay

* score the param probes, but I can't figure out how to access the relevant solver statistics fields from the statistics obj

* set up pattern to notify batch manager so worker threads can update their params accordingly

* add a getter for solver stats; it compiles but everything is still untested

* bugfix

* updates to param tuning

* remove the getter for solver statistics since we're getting the vals directly from the context

* disable nuget

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* change logic NRA->ALL in log_lemma

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* merge

* patch fix for default manager construction so it can be used to create the param tuning context without segfault

* add tests showing shortcomings of factorization

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* still debugging threading issues: we can't create nested param tuners or it spins infinitely (added a flag for this), but now there is a segfault on the probe_ctx.check() call

* Add missing string replace operations to Java API (#8011)

* Initial plan

* Add C API and Java bindings for str.replace_all, str.replace_re, str.replace_all_re

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Add test for new Java string replace operations

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Remove author field from test file header

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Delete examples/java/StringReplaceTest.java

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>
Co-authored-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* make param tuning single-threaded to resolve segfault when spawning subprocesses for nested ctx checks

* check propagate ineqs setting before applying simplifier

* comment out parameter check

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* add some toggle-able params to smt_parallel_params.pyg for doing the param tuning experiments on QF_RDL. set up this logic in the smt_parallel files

* add bash scripts to run param experiments on an QF_RDL example to get datapoints

* fix bug about param protocol iteration only happening once, and add a new user param to toggle running only the param tuning thread without parallel solving (just to test whether it finds good settings)

* add results of exhaustive param testing for QF_RDL_abz5_1200

* fix infinite loop in update function

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Add check that argument of Z3_is_algebraic_number is_expr (#8027)

To make sure that the `to_expr` cast is safe.

Signed-off-by: Josh Berdine <josh@berdine.net>

* Add Z3_fpa_is_numeral to the API (#8026)

This is analogous to Z3_fpa_is_numeral_nan, Z3_fpa_is_numeral_inf, etc. and
can be needed to check that inputs are valid before calling those functions.

Signed-off-by: Josh Berdine <josh@berdine.net>

* Make rcf is_rational and is_rational_function operations handle zero (#8025)

The representation of the zero rcf numeral is nullptr, and the is_rational
and is_rational_function operations are not expecting to be called with
nullptr. But there isn't a way to test for that in the API, other than
checking if Z3_rcf_num_to_string returns "0".

This patch adds a couple of conditions so that the is_rational and
is_rational_function operations handle zero. Maybe this isn't the desired
change. For instance, the is_zero operation could instead be exposed in the
API and preconditions added to the relevant operations.

Signed-off-by: Josh Berdine <josh@berdine.net>
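The fix can be modeled in a few lines (a sketch only; `None` stands in for the `nullptr` representation of zero, and the dict-based numerals are invented for illustration):

```python
def is_rational(num):
    # Zero is represented by None (mirroring nullptr in the C++
    # realclosure code); zero is rational.
    if num is None:
        return True
    return num.get('kind') == 'rational'

def is_rational_function(num):
    # Zero is a (degenerate) rational function as well.
    if num is None:
        return True
    return num.get('kind') in ('rational', 'rational_function')
```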

* add back statistics to smt-parallel

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* update package lock

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* strengthen filter for unknown by checking relevancy of parents #8022

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix #8024

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Fix NuGet package missing Microsoft.Z3.dll due to inverted replace() logic (#8029)

* Initial plan

* Fix NuGet packaging and add GitHub Actions workflow

- Fixed critical bug in mk_nuget_task.py replace() function
- Created comprehensive GitHub Actions workflow for building NuGet packages

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Add documentation and improve code comments

- Added comprehensive README for NuGet build workflow
- Added detailed comments explaining the replace() function fix
- Verified all Python syntax is correct

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* small bugfix in search tree?

* Delete sweep_results_QF_RDL_abz5_1200.csv

* fix escape quotes

* Bump actions/upload-artifact from 4 to 5 (#8034)

Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4 to 5.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump actions/setup-python from 5 to 6 (#8033)

Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump actions/download-artifact from 4 to 6 (#8032)

Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 4 to 6.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](https://github.com/actions/download-artifact/compare/v4...v6)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* factor out coi, use polynomial elaboration for nlsat solver (#8039)

* factor out coi, use polynomial elaboration for nlsat solver

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* remove unused functionality

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

---------

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* check cancelation in invariant checker

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* mk_util.py: fix --gprof option (#8040)

The addition of -fomit-frame-pointer was missing a space (which broke
the command line), but also this option should be added only if -pg is
*not* given, as they are incompatible. So, just remove this line to fix
the --gprof flag in configure.

Also, this option is implied by any level of `-O`, so there is no need
to pass it explicitly in most cases. It could be added to debug,
non-profile builds, but I'm not sure that's useful.
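The resulting flag logic can be sketched as (a hypothetical helper mirroring the behavior described above, not the actual mk_util.py code):

```python
def profile_flags(gprof_enabled):
    # -pg (gprof instrumentation) needs frame pointers, so
    # -fomit-frame-pointer must not be combined with it. Any -O level
    # already implies -fomit-frame-pointer, so we never add it manually.
    flags = []
    if gprof_enabled:
        flags.append('-pg')
    return flags
```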

* unsound lemma

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* better state

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* t

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* remove unused method

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* add coefficients from the elim_vanishing to m_todo

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* use indexed root expressions in add_zero_assumption

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* log for smtrat

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* improve log_lemma

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* handle the case with no roots in add_zero_assumption

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* improve logging

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* remove add_zero_assumption from pcs()

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* remove unused code

* refactoring

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* restart projection when a non-trivial nullified polynomial is found, and remove is_square_free

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* optionally call add_zero_assumption on a vanishing discriminant

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* disable add_zero_disc(disc) by default

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* remove the exit statement

* remove the debug print

* Bump actions/checkout from 5 to 6 (#8043)

Bumps [actions/checkout](https://github.com/actions/checkout) from 5 to 6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* parameter correct order experiment

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* try reordering before analyzing bounds

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* use edit distance for simplified error messaging on wrong trace tags

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>
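A standard Levenshtein distance is enough to drive such "did you mean" suggestions (a self-contained sketch; the actual z3 implementation may differ):

```python
def edit_distance(a, b):
    # Classic dynamic-programming Levenshtein distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def closest_tag(bad_tag, known_tags):
    # Suggest the known trace tag closest to the misspelled one.
    return min(known_tags, key=lambda t: edit_distance(bad_tag, t))
```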

* apply gcd test also before saturation

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Return bool instead of int from Z3_rcf_interval (#8046)

In the underlying realclosure implementation, the interval operations for
{`lower`,`upper`}`_is_`{`inf`,`open`} return `bool` results. Currently these
are cast to `int` when surfacing them to the API. This patch keeps them at
type `bool` through to `Z3_rcf_interval`.

Signed-off-by: Josh Berdine <josh@berdine.net>

* Return sign from Z3_fpa_get_numeral_sign as bool instead of int (#8047)

The underlying `mpf_manager::sgn` function returns a `bool`, and functions
such as `Z3_mk_fpa_numeral_int_uint` take the sign as a `bool`.

Signed-off-by: Josh Berdine <josh@berdine.net>

* Return bool instead of int in extra_API for Z3_open_log (#8048)

The C declaration returns `bool`.

Signed-off-by: Josh Berdine <josh@berdine.net>

* update doc test string

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* open_log returns bool

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* update java API code to work with boolean pointers

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* remove unused

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* port to BoolPtr

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix type for BoolPtr

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* port dotnet to use bool sorts from API

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix warnings in nra_solver

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix warnings in nla_pp

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix dotnet build errors

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* python type fixes

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix build warnings

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* use c_bool instead of c_int for sign

* remove references to set_has_size

* fix second byref to bool

* remove set cardinality operators from array theory. Make final-check use priority levels

Issue #7502 shows that running nlsat eagerly during final check can block quantifier instantiation.
To give space for quantifier instances, we introduce two levels for final check, so that nlsat is only applied at the second and final level.
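The priority-level idea can be sketched as follows (a hypothetical model, not the actual smt_context code; the level assignments are assumptions for illustration):

```python
def final_check(theories, levels=2):
    # Run final-check in priority levels: cheap theories (and quantifier
    # instantiation) get a chance at level 0; expensive engines such as
    # nlsat only run at the last level, so they cannot starve the others.
    for level in range(levels):
        progress = False
        for th in theories:
            if th['level'] <= level and th['check']():
                progress = True   # theory added lemmas or instances
        if progress:
            return 'continue'     # resume search before escalating
    return 'done'
```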

* insert theory only once

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* refine give-up conditions

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Fix _in vs _out def_API param for Z3_solver_get_levels (#8050)

Signed-off-by: Josh Berdine <josh@berdine.net>

* remove deprecated set_has_size

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* remove deprecated set_has_size

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* remove deprecated set_has_size

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* remove deprecated set_has_size

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix #8055

* fix #8054

inherit denominators when evaluating polynomials

* remove unused *_signed_project() methods

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* Disable C++98 compatibility warnings for Clang builds (#8060)

* Initial plan

* Disable C++98 compatibility warnings for Clang to fix vcpkg build freeze

Add -Wno-c++98-compat and -Wno-c++98-compat-pedantic flags to prevent
excessive warning output when building with clang-cl or when -Weverything
is enabled. These warnings are not useful for Z3 since it requires C++20.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* fix the build

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* allow parsing declared arrays without requiring explicit select

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* disable preprocessing only after formulas are internalized

* bring in nikolaj's preprocessing patch from master

* don't unfold recursive defs if there is an uninterpreted subterm, #7671

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* remove stale experimental code #8063

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Implement Z3_optimize_translate for context translation (#8072)

* Initial plan

* Implement Z3_optimize_translate functionality

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Fix compilation errors and add tests for optimize translate

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Revert changes to opt_solver.cpp as requested

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Search tree core resolution optimization (#8066)

* Add cube tree optimization that resolves cores recursively up the path to prune. Also integrate assumptions into the tree so they're not tracked separately (#7960)

* draft attempt at optimizing cube tree with resolvents; not tested/run yet

* adding comments

* fix bug about needing to bubble resolvent upwards to highest ancestor

* fix bug where we need to cover the whole resolvent in the path when bubbling up

* clean up comments

* close entire tree when sibling resolvent is empty

* integrate asms directly into cube tree, remove separate tracking

* try to fix bug about redundant resolutions, merging close and try_resolve_upwards into one function

* separate the logic again to avoid mutual recursion

* Refactor search tree closure and resolution logic

Refactor close_with_core to simplify logic and remove unnecessary parameters. Update sibling resolvent computation and try_resolve_upwards for clarity.

* apply formatting

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Refactor close_with_core to use current node in lambda

* Fix formatting issues in search_tree.h

* fix build issues

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Update smt_parallel.cpp

* Change loop variable type in unsat core processing

* Change method to retrieve unsat core from root

---------

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>
Co-authored-by: Nikolaj Bjorner <nbjorner@microsoft.com>
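The core-bubbling idea from the bullets above can be modeled minimally (a simplified sketch of the pruning criterion only, not the actual search_tree.h code; sibling-resolvent bookkeeping is omitted):

```python
# When a cube (the path of literals from root to leaf) is refuted by an
# unsat core, every ancestor whose splitting literal does not occur in
# the core is refuted for the same reason, so the subtree can be closed
# at the highest such ancestor. An empty core closes the entire tree.
def highest_closable_ancestor(path, core):
    # path: literals from root to leaf; core: literals actually used.
    level = len(path)
    while level > 0 and path[level - 1] not in core:
        level -= 1
    return level  # close the subtree rooted just below this depth
```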

* Typescript typedef and doc fixes (#8073)

* Fix Typescript typedef to allow `new Context`

* fix init() tsdoc example using nonexistent sat import

* Revert "Typescript typedef and doc fixes (#8073)" (#8077)

This reverts commit 6cfbcd19df.

* Typescript typedef and doc fixes take 2 (#8078)

* Fix Typescript typedef to allow `new Context`

* fix init() tsdoc example using nonexistent sat import

* Fix DEL character (0x7F) not being escaped in string literals (#8080)

* Initial plan

* Fix DEL character encoding in string literals

Change condition from `ch >= 128` to `ch >= 127` to include the DEL
character (U+007F, 127) in escaped output. This ensures that the
non-printable DEL control character is properly escaped as \u{7f}
instead of being printed directly.

Also add test cases for DEL and other control characters.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>
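The corrected condition, modeled in Python (the real code is in the C++ string printer; this function is only an illustration of the boundary change):

```python
def escape_char(ch):
    # The old test was ord(ch) >= 128, which let DEL (U+007F) through
    # unescaped; >= 127 escapes it too, as \u{7f} in SMT-LIB syntax.
    code = ord(ch)
    if code >= 127:
        return '\\u{%x}' % code
    return ch
```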

* Add GitHub Actions workflow to publish JavaScript/TypeScript API documentation (#8084)

* Initial plan

* Add GitHub Actions workflow to build and publish documentation

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Refine documentation workflow to use mk_api_doc.py and install doxygen

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Clarify documentation generation step name

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Bump actions/cache from 4 to 5 (#8081)

Bumps [actions/cache](https://github.com/actions/cache) from 4 to 5.
- [Release notes](https://github.com/actions/cache/releases)
- [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
- [Commits](https://github.com/actions/cache/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/cache
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump actions/download-artifact from 6 to 7 (#8082)

Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 6 to 7.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](https://github.com/actions/download-artifact/compare/v6...v7)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-version: '7'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Build native Z3 for x64 before generating docs (#8086)

* Initial plan

* Build Z3 for ubuntu x64 and invoke JS build scripts in docs workflow

- Modified docs.yml to build Z3 natively for ubuntu-latest x64 before documentation generation
- Removed default working directory to allow explicit working-directory per step
- Added step to build Z3 using python scripts/mk_make.py
- Kept emscripten setup as required for wasm build
- Modified mk_api_doc.py to invoke npm run build:ts and npm run build:wasm as described in src/api/js/README.md
- mk_api_doc.py now runs the full build process (TypeScript + wasm) before generating documentation

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Source emsdk environment before generating documentation

- Added source command to ensure emscripten tools are in PATH when mk_api_doc.py runs npm scripts
- This ensures emcc and other emscripten tools are available for npm run build:wasm

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Revert changes to mk_api_doc.py

- Removed npm run build:ts and npm run build:wasm calls from mk_api_doc.py
- Reverted to only calling npm run docs as before

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Bump actions/upload-artifact from 5 to 6 (#8083)

Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 5 to 6.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* build the ts bindings

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* build the ts bindings

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Update docs.yml

* Update docs.yml

* Add defaults for job run working directory

* Update docs.yml

* BLD: Add CMake option to build Python bindings without rebuilding libz3 (redux) (#8088)

* Add CMake option to build only Python bindings without rebuilding libz3

Introduce Z3_BUILD_LIBZ3_CORE option (default ON) to control whether libz3 is built.
When set to OFF with Z3_BUILD_PYTHON_BINDINGS=ON, only Python bindings are built
using a pre-installed libz3 library. This is useful for package managers like
conda-forge to avoid rebuilding libz3 for each Python version.

Changes:
- Add Z3_BUILD_LIBZ3_CORE option in src/CMakeLists.txt
- When OFF, find and use pre-installed libz3 as imported target
- Update Python bindings CMakeLists.txt to handle both built and imported libz3
- Add documentation in README-CMake.md with usage examples

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Fix CMake export issues when building only Python bindings

Conditionally export Z3_EXPORTED_TARGETS only when Z3_BUILD_LIBZ3_CORE=ON
to avoid errors when building Python bindings without building libz3.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Disable executable and test builds when not building libz3 core

When Z3_BUILD_LIBZ3_CORE=OFF, automatically disable Z3_BUILD_EXECUTABLE
and Z3_BUILD_TEST_EXECUTABLES to avoid build/install errors.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* only build src/ folder if Z3_BUILD_LIBZ3_CORE is TRUE

* move z3 python bindings to main CMake

* move more logic to main CMakeLists.txt

* move Z3_API_HEADER_FILES_TO_SCAN to main CMakeLists.txt

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* merge

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Fix docs.yml workflow: remove conflicting native build step (#8091)

* Initial plan

* Fix docs.yml workflow to properly build and deploy documentation

- Remove `no-install: true` from setup-emsdk step to properly install emscripten
- Remove `working-directory: src/api/js` from setup-emsdk step (not needed)
- Remove manual emsdk install/activate/source commands from Build wasm step
- Remove redundant native Z3 build step that conflicted with wasm build
- Simplify Generate Documentation step by removing redundant source command

The main issue was that the native Z3 build was creating a Makefile in the build/
directory before the wasm build ran. The wasm build script then saw the Makefile
existed and skipped the emconfigure step, but the native Makefile doesn't have a
libz3.a target, causing the build to fail. Removing the native build allows the
wasm build to properly configure its own build with emscripten.

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* fix #8092

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* use new arithmetic solver for AUFLIA, fixes #8090

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Fix docs.yml workflow: resolve WASM/native library conflict in documentation generation (#8093)

* Initial plan

* Fix docs.yml: Build native Z3 Python bindings before WASM to avoid library conflicts

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Update docs.yml

* Update docs.yml

* Add working directory for wasm build step

* Simplify CI workflow by removing emscripten steps

Removed unnecessary steps for emscripten setup and TypeScript/WASM build in the CI workflow.

* Deploy docs to z3prover.github.io organization pages (#8094)

* Initial plan

* Deploy docs to z3prover.github.io organization pages

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Update docs.yml

* Update publish directory for documentation deployment

* Modify docs.yml for deployment settings

Updated the GitHub Actions workflow for documentation deployment, changing the publish directory and removing the push trigger.

* fix indentation

* docs with ml bindings

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Fix docs.yml workflow: update actions to v4 (#8095)

* Initial plan

* Fix docs.yml workflow: update GitHub Actions to valid versions

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* update doc

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* updated with env ocaml

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* include parameters

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* enable js

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Modify docs.yml to generate JS documentation

Updated documentation generation script to include JavaScript output.

* Update docs.yml

* try adding wasm as separate step

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix build dir

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* make build directory configurable via env

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Fix Z3BUILD environment variable in docs workflow

* Refactor documentation workflow to simplify installation

Remove redundant command for installing Python package.

* make build directory configurable

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* set build directory

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* na

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Fix docs.yml workflow: specify working directory for npm commands (#8098)

* Initial plan

* Fix docs.yml build by adding working-directory to npm steps

Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>

* Update docs.yml

* fix #8097

* flight test copilot generated slop?

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* indent

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* naming convention

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* update to macos-latest

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* next flight test

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* remove flight test

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* Some changes to improve LIA performance (#8101)

* add user params

* inprocessing flag

* playing around with clause sharing with some arith constraints (complicated version commented out)

* collect shared clauses inside share units after pop to base level (might help NIA)

* don't collect clauses twice

* don't pop to base level when sharing units; filter manually

* clean up code

---------

Co-authored-by: Ilana Shapiro <ilanashapiro@Mac.localdomain>

* fix #8102

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fix #8076

remove unsound "optimization" for correction sets. It misses feasible solutions

* assert entry_invariant only when all changes are done

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>

* fix #8099 (again)

Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>

* fixes to finite domain arrays

- relevancy could be off and array solver doesn't compensate, #7544
- enforce equalities across store for small domain axioms #8065

* reduce rdl tunable params

* new RDL scoring

* remove pop to base lvl for tuner

* scaling m_max_prefix_conflicts

* try to mutate pairs

* go back to single param flip version

* new scoring

* change some scoring strategies, add LIA and NIA param tuning

* fix bug where updt_params automatically spawned new parallel objects in param generator checks

---------

Signed-off-by: Lev Nachmanson <levnach@hotmail.com>
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: Nikolaj Bjorner <nbjorner@microsoft.com>
Signed-off-by: Josh Berdine <josh@berdine.net>
Co-authored-by: Lev Nachmanson <levnach@hotmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
Co-authored-by: NikolajBjorner <3085284+NikolajBjorner@users.noreply.github.com>
Co-authored-by: Nikolaj Bjorner <nbjorner@microsoft.com>
Co-authored-by: Nelson Elhage <nelhage@nelhage.com>
Co-authored-by: hwisungi <hwisungi@users.noreply.github.com>
Co-authored-by: Josh Berdine <josh@berdine.net>
Co-authored-by: Guido Martínez <mtzguido@gmail.com>
Co-authored-by: Chris Cowan <agentme49@gmail.com>
Co-authored-by: h-vetinari <h.vetinari@gmx.com>
Co-authored-by: Ilana Shapiro <ilanashapiro@Mac.localdomain>
Co-authored-by: Ilana Shapiro <ilanashapiro@Ilanas-MBP.localdomain>
Co-authored-by: Ilana Shapiro <ilanashapiro@Ilanas-MBP.lan1>
Co-authored-by: Ilana Shapiro <ilanashapiro@Ilanas-MacBook-Pro.local>
This commit is contained in:
Ilana Shapiro 2026-01-09 14:14:44 -08:00 committed by GitHub
parent abbbc3c530
commit c2102c8dfb
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
189 changed files with 2843 additions and 6259 deletions

.github/workflows/NUGET_BUILD_README.md (new file, vendored; 87 lines)

@@ -0,0 +1,87 @@
# NuGet Package Build Workflow
This document describes the GitHub Actions workflow for building Z3 NuGet packages.
## Overview
The NuGet build workflow (`.github/workflows/nuget-build.yml`) creates Microsoft.Z3 NuGet packages for distribution. It builds Z3 for all supported platforms and assembles them into NuGet packages.
## Triggering the Workflow
The workflow can be triggered in two ways:
### 1. Manual Trigger
You can manually trigger the workflow from the GitHub Actions tab:
1. Go to the "Actions" tab in the repository
2. Select "Build NuGet Package" workflow
3. Click "Run workflow"
4. Enter the version number (e.g., `4.15.5`)
5. Click "Run workflow"
### 2. Tag-based Trigger
The workflow automatically runs when you push a tag with the `z3-` prefix:
```bash
git tag z3-4.15.5
git push origin z3-4.15.5
```
## Workflow Structure
The workflow consists of multiple jobs:
### Build Jobs
1. **build-windows-x64**: Builds Windows x64 binaries with .NET support
2. **build-windows-x86**: Builds Windows x86 binaries with .NET support
3. **build-windows-arm64**: Builds Windows ARM64 binaries with .NET support
4. **build-ubuntu**: Builds Linux x64 binaries with .NET support
5. **build-macos-x64**: Builds macOS x64 binaries with .NET support
6. **build-macos-arm64**: Builds macOS ARM64 binaries with .NET support
### Package Jobs
1. **package-nuget-x64**: Creates the main NuGet package (Microsoft.Z3.nupkg) with x64, ARM64, Linux, and macOS support
2. **package-nuget-x86**: Creates the x86 NuGet package (Microsoft.Z3.x86.nupkg)
## Output
The workflow produces two NuGet packages as artifacts:
- `Microsoft.Z3.{version}.nupkg` and `Microsoft.Z3.{version}.snupkg` (x64 + multi-platform)
- `Microsoft.Z3.x86.{version}.nupkg` and `Microsoft.Z3.x86.{version}.snupkg` (x86 only)
These can be downloaded from the workflow run's artifacts section.
## Key Files
- `.github/workflows/nuget-build.yml`: The workflow definition
- `scripts/mk_nuget_task.py`: Script that assembles the NuGet package from build artifacts
- `scripts/mk_win_dist.py`: Script for building Windows x86/x64 distributions
- `scripts/mk_win_dist_cmake.py`: Script for building Windows ARM64 distributions
- `scripts/mk_unix_dist.py`: Script for building Linux and macOS distributions
## Bug Fix
This workflow includes a fix for a critical bug in `mk_nuget_task.py` where the `replace()` function had incorrect logic that would fail to copy files when the destination already existed. The fix ensures that Microsoft.Z3.dll and related files are always properly included in the NuGet package under `lib/netstandard2.0/`.
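A minimal Python sketch of the corrected copy helper (hypothetical names and shape; the real fix lives in `scripts/mk_nuget_task.py`):

```python
import os
import shutil

def replace(src: str, dst: str) -> None:
    # Overwrite the destination unconditionally. The original buggy logic
    # skipped the copy when `dst` already existed, which could leave
    # Microsoft.Z3.dll and related files out of lib/netstandard2.0/.
    if os.path.exists(dst):
        os.remove(dst)
    shutil.copy2(src, dst)
```

The key point is that a pre-existing destination file must be replaced, not treated as a reason to skip the copy.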
## Development
To test changes to the NuGet packaging locally, you can:
1. Build the platform-specific binaries using the appropriate build scripts
2. Collect the resulting ZIP files in a directory
3. Run `mk_nuget_task.py` to assemble the package:
```bash
python scripts/mk_nuget_task.py <packages_dir> <version> <repo_url> <branch> <commit> <source_dir> [symbols] [x86]
```
4. Use the NuGet CLI to pack the package:
```bash
nuget pack out/Microsoft.Z3.sym.nuspec -OutputDirectory . -Verbosity detailed -Symbols -SymbolPackageFormat snupkg -BasePath out
```

@@ -22,7 +22,7 @@ jobs:
 runs-on: windows-latest
 steps:
 - name: Checkout code
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - name: Add msbuild to PATH
   uses: microsoft/setup-msbuild@v2
 - run: |

@@ -21,7 +21,7 @@ jobs:
 steps:
 - name: Checkout code
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - name: Configure CMake and build
   run: |
@@ -32,7 +32,7 @@ jobs:
 tar -cvf z3-build-${{ matrix.android-abi }}.tar *.jar *.so
 - name: Archive production artifacts
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: android-build-${{ matrix.android-abi }}
 path: build/z3-build-${{ matrix.android-abi }}.tar

.github/workflows/ask.lock.yml (generated, vendored; 10 changed lines)

@@ -569,7 +569,7 @@ jobs:
 output: ${{ steps.collect_output.outputs.output }}
 steps:
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - name: Setup agent output
   id: setup_agent_output
   uses: actions/github-script@v8
@@ -1223,7 +1223,7 @@ jobs:
 .write();
 - name: Upload agentic run info
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw_info.json
 path: /tmp/aw_info.json
@@ -1329,7 +1329,7 @@ jobs:
 echo "" >> $GITHUB_STEP_SUMMARY
 - name: Upload agentic output file
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: safe_output.jsonl
 path: ${{ env.GITHUB_AW_SAFE_OUTPUTS }}
@@ -2277,7 +2277,7 @@ jobs:
 await main();
 - name: Upload sanitized agent output
   if: always() && env.GITHUB_AW_AGENT_OUTPUT
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: agent_output.json
 path: ${{ env.GITHUB_AW_AGENT_OUTPUT }}
@@ -2814,7 +2814,7 @@ jobs:
 main();
 - name: Upload agent logs
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: question-answering-researcher.log
 path: /tmp/question-answering-researcher.log

.github/workflows/ci-doctor.lock.yml (generated, vendored; 12 changed lines)

@@ -36,10 +36,10 @@ jobs:
 output: ${{ steps.collect_output.outputs.output }}
 steps:
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 # Cache configuration from frontmatter processed below
 - name: Cache (investigation-memory-${{ github.repository }})
-  uses: actions/cache@v4
+  uses: actions/cache@v5
 with:
 key: investigation-memory-${{ github.repository }}
 path: |
@@ -808,7 +808,7 @@ jobs:
 .write();
 - name: Upload agentic run info
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw_info.json
 path: /tmp/aw_info.json
@@ -911,7 +911,7 @@ jobs:
 echo "" >> $GITHUB_STEP_SUMMARY
 - name: Upload agentic output file
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: safe_output.jsonl
 path: ${{ env.GITHUB_AW_SAFE_OUTPUTS }}
@@ -1859,7 +1859,7 @@ jobs:
 await main();
 - name: Upload sanitized agent output
   if: always() && env.GITHUB_AW_AGENT_OUTPUT
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: agent_output.json
 path: ${{ env.GITHUB_AW_AGENT_OUTPUT }}
@@ -2396,7 +2396,7 @@ jobs:
 main();
 - name: Upload agent logs
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: ci-failure-doctor.log
 path: /tmp/ci-failure-doctor.log

@@ -20,7 +20,7 @@ jobs:
 steps:
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - name: Initialize CodeQL
   uses: github/codeql-action/init@v4

@@ -19,7 +19,7 @@ jobs:
 COV_DETAILS_PATH: ${{github.workspace}}/cov-details
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 - name: Setup
   run: |
@@ -89,13 +89,13 @@ jobs:
 id: date
 run: echo "date=$(date +'%Y-%m-%d')" >> $GITHUB_OUTPUT
-- uses: actions/upload-artifact@v5
+- uses: actions/upload-artifact@v6
 with:
 name: coverage-${{steps.date.outputs.date}}
 path: ${{github.workspace}}/coverage.html
 retention-days: 4
-- uses: actions/upload-artifact@v5
+- uses: actions/upload-artifact@v6
 with:
 name: coverage-details-${{steps.date.outputs.date}}
 path: ${{env.COV_DETAILS_PATH}}

@@ -19,7 +19,7 @@ jobs:
 steps:
 - name: Checkout code
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - name: Install cross build tools
   run: apt update && apt install -y ninja-build cmake python3 g++-11-${{ matrix.arch }}-linux-gnu

@@ -25,7 +25,7 @@ jobs:
 output: ${{ steps.collect_output.outputs.output }}
 steps:
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - name: Configure Git credentials
   run: |
 git config --global user.email "github-actions[bot]@users.noreply.github.com"
@@ -747,7 +747,7 @@ jobs:
 .write();
 - name: Upload agentic run info
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw_info.json
 path: /tmp/aw_info.json
@@ -856,7 +856,7 @@ jobs:
 echo "" >> $GITHUB_STEP_SUMMARY
 - name: Upload agentic output file
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: safe_output.jsonl
 path: ${{ env.GITHUB_AW_SAFE_OUTPUTS }}
@@ -1804,7 +1804,7 @@ jobs:
 await main();
 - name: Upload sanitized agent output
   if: always() && env.GITHUB_AW_AGENT_OUTPUT
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: agent_output.json
 path: ${{ env.GITHUB_AW_AGENT_OUTPUT }}
@@ -2341,7 +2341,7 @@ jobs:
 main();
 - name: Upload agent logs
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: daily-backlog-burner.log
 path: /tmp/daily-backlog-burner.log
@@ -2435,7 +2435,7 @@ jobs:
 fi
 - name: Upload git patch
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw.patch
 path: /tmp/aw.patch
@@ -2946,12 +2946,12 @@ jobs:
 steps:
 - name: Download patch artifact
   continue-on-error: true
-  uses: actions/download-artifact@v6
+  uses: actions/download-artifact@v7
 with:
 name: aw.patch
 path: /tmp/
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 with:
 fetch-depth: 0
 - name: Configure Git credentials

@@ -25,7 +25,7 @@ jobs:
 output: ${{ steps.collect_output.outputs.output }}
 steps:
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - id: check_build_steps_file
   name: Check if action.yml exists
   run: |
@@ -822,7 +822,7 @@ jobs:
 .write();
 - name: Upload agentic run info
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw_info.json
 path: /tmp/aw_info.json
@@ -931,7 +931,7 @@ jobs:
 echo "" >> $GITHUB_STEP_SUMMARY
 - name: Upload agentic output file
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: safe_output.jsonl
 path: ${{ env.GITHUB_AW_SAFE_OUTPUTS }}
@@ -1879,7 +1879,7 @@ jobs:
 await main();
 - name: Upload sanitized agent output
   if: always() && env.GITHUB_AW_AGENT_OUTPUT
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: agent_output.json
 path: ${{ env.GITHUB_AW_AGENT_OUTPUT }}
@@ -2416,7 +2416,7 @@ jobs:
 main();
 - name: Upload agent logs
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: daily-perf-improver.log
 path: /tmp/daily-perf-improver.log
@@ -2510,7 +2510,7 @@ jobs:
 fi
 - name: Upload git patch
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw.patch
 path: /tmp/aw.patch
@@ -3021,12 +3021,12 @@ jobs:
 steps:
 - name: Download patch artifact
   continue-on-error: true
-  uses: actions/download-artifact@v6
+  uses: actions/download-artifact@v7
 with:
 name: aw.patch
 path: /tmp/
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 with:
 fetch-depth: 0
 - name: Configure Git credentials

@@ -25,7 +25,7 @@ jobs:
 output: ${{ steps.collect_output.outputs.output }}
 steps:
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 - id: check_coverage_steps_file
   name: Check if action.yml exists
   run: |
@@ -797,7 +797,7 @@ jobs:
 .write();
 - name: Upload agentic run info
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw_info.json
 path: /tmp/aw_info.json
@@ -906,7 +906,7 @@ jobs:
 echo "" >> $GITHUB_STEP_SUMMARY
 - name: Upload agentic output file
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: safe_output.jsonl
 path: ${{ env.GITHUB_AW_SAFE_OUTPUTS }}
@@ -1854,7 +1854,7 @@ jobs:
 await main();
 - name: Upload sanitized agent output
   if: always() && env.GITHUB_AW_AGENT_OUTPUT
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: agent_output.json
 path: ${{ env.GITHUB_AW_AGENT_OUTPUT }}
@@ -2391,7 +2391,7 @@ jobs:
 main();
 - name: Upload agent logs
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: daily-test-coverage-improver.log
 path: /tmp/daily-test-coverage-improver.log
@@ -2485,7 +2485,7 @@ jobs:
 fi
 - name: Upload git patch
   if: always()
-  uses: actions/upload-artifact@v5
+  uses: actions/upload-artifact@v6
 with:
 name: aw.patch
 path: /tmp/aw.patch
@@ -2996,12 +2996,12 @@ jobs:
 steps:
 - name: Download patch artifact
   continue-on-error: true
-  uses: actions/download-artifact@v6
+  uses: actions/download-artifact@v7
 with:
 name: aw.patch
 path: /tmp/
 - name: Checkout repository
-  uses: actions/checkout@v5
+  uses: actions/checkout@v6
 with:
 fetch-depth: 0
 - name: Configure Git credentials

.github/workflows/docs.yml (new file, vendored; 109 lines)

@@ -0,0 +1,109 @@
name: Documentation

on:
  workflow_dispatch:

permissions:
  contents: read

concurrency:
  group: "pages"
  cancel-in-progress: false

env:
  EM_VERSION: 3.1.73

jobs:
  build-docs:
    name: Build Documentation
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v6
      - name: Setup node
        uses: actions/setup-node@v6
        with:
          node-version: "lts/*"
      # Setup OCaml via action
      - uses: ocaml/setup-ocaml@v3
        with:
          ocaml-compiler: 5
          opam-disable-sandboxing: true
      - name: Install system dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y doxygen graphviz python3 python3-pip
          sudo apt-get install -y \
            bubblewrap m4 libgmp-dev pkg-config
      - name: Install required opam packages
        run: opam install -y ocamlfind zarith
      - name: Build Z3 natively for Python documentation
        run: |
          eval $(opam env)
          echo "CC: $CC"
          echo "CXX: $CXX"
          echo "OCAMLFIND: $(which ocamlfind)"
          echo "OCAMLC: $(which ocamlc)"
          echo "OCAMLOPT: $(which ocamlopt)"
          echo "OCAML_VERSION: $(ocamlc -version)"
          echo "OCAMLLIB: $OCAMLLIB"
          mkdir build-x64
          python3 scripts/mk_make.py --python --ml --build=build-x64
          cd build-x64
          make -j$(nproc)
      - name: Generate Documentation (from doc directory)
        working-directory: doc
        run: |
          eval $(opam env)
          python3 mk_api_doc.py --mld --output-dir=api --z3py-package-path=../build-x64/python/z3 --build=../build-x64
          Z3BUILD=build-x64 python3 mk_params_doc.py
          mkdir api/html/ml
          ocamldoc -html -d api/html/ml -sort -hide Z3 -I $( ocamlfind query zarith ) -I ../build-x64/api/ml ../build-x64/api/ml/z3enums.mli ../build-x64/api/ml/z3.mli
      - name: Setup emscripten
        uses: mymindstorm/setup-emsdk@v14
        with:
          no-install: true
          version: ${{env.EM_VERSION}}
          actions-cache-folder: "emsdk-cache"
      - name: Install dependencies
        working-directory: src/api/js
        run: npm ci
      - name: Build TypeScript
        working-directory: src/api/js
        run: npm run build:ts
      - name: Build wasm
        working-directory: src/api/js
        run: |
          emsdk install ${EM_VERSION}
          emsdk activate ${EM_VERSION}
          source $(dirname $(which emsdk))/emsdk_env.sh
          which node
          which clang++
          npm run build:wasm
      - name: Generate JS Documentation (from doc directory)
        working-directory: doc
        run: |
          eval $(opam env)
          python3 mk_api_doc.py --js --output-dir=api --mld --z3py-package-path=../build-x64/python/z3 --build=../build-x64
      - name: Deploy to z3prover.github.io
        uses: peaceiris/actions-gh-pages@v4
        with:
          deploy_key: ${{ secrets.ACTIONS_DEPLOY_KEY }}
          external_repository: Z3Prover/z3prover.github.io
          destination_dir: ./api
          publish_branch: master
          publish_dir: ./doc/api
          user_name: github-actions[bot]
          user_email: github-actions[bot]@users.noreply.github.com

@@ -13,7 +13,7 @@ jobs:
  genai-issue-labeller:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@v5
+     - uses: actions/checkout@v6
      - uses: pelikhan/action-genai-issue-labeller@v0
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}

@@ -14,7 +14,7 @@ jobs:
      BUILD_TYPE: Release
    steps:
      - name: Checkout Repo
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
      - name: Build
        run: |

@@ -14,7 +14,7 @@ jobs:
      BUILD_TYPE: Release
    steps:
      - name: Checkout Repo
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
      - name: Build
        run: |

.github/workflows/nuget-build.yml (new file, 256 lines)
@@ -0,0 +1,256 @@
name: Build NuGet Package
on:
workflow_dispatch:
inputs:
version:
description: 'Version number for the NuGet package (e.g., 4.15.5)'
required: true
default: '4.15.5'
push:
tags:
- 'z3-*'
permissions:
contents: write
jobs:
# Build Windows binaries
build-windows-x64:
runs-on: windows-latest
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Build Windows x64
shell: cmd
run: |
call "C:\Program Files\Microsoft Visual Studio\2022\Enterprise\VC\Auxiliary\Build\vcvarsall.bat" x64
python scripts\mk_win_dist.py --x64-only --dotnet-key=%GITHUB_WORKSPACE%\resources\z3.snk --assembly-version=${{ github.event.inputs.version || '4.15.5' }} --zip
- name: Upload Windows x64 artifact
uses: actions/upload-artifact@v6
with:
name: windows-x64
path: dist/*.zip
retention-days: 1
build-windows-x86:
runs-on: windows-latest
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Build Windows x86
shell: cmd
run: |
call "C:\Program Files\Microsoft Visual Studio\2022\Enterprise\VC\Auxiliary\Build\vcvarsall.bat" x86
python scripts\mk_win_dist.py --x86-only --dotnet-key=%GITHUB_WORKSPACE%\resources\z3.snk --assembly-version=${{ github.event.inputs.version || '4.15.5' }} --zip
- name: Upload Windows x86 artifact
uses: actions/upload-artifact@v6
with:
name: windows-x86
path: dist/*.zip
retention-days: 1
build-windows-arm64:
runs-on: windows-latest
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Build Windows ARM64
shell: cmd
run: |
call "C:\Program Files\Microsoft Visual Studio\2022\Enterprise\VC\Auxiliary\Build\vcvarsall.bat" amd64_arm64
python scripts\mk_win_dist_cmake.py --arm64-only --dotnet-key=%GITHUB_WORKSPACE%\resources\z3.snk --assembly-version=${{ github.event.inputs.version || '4.15.5' }} --zip
- name: Upload Windows ARM64 artifact
uses: actions/upload-artifact@v6
with:
name: windows-arm64
path: build-dist\arm64\dist\*.zip
retention-days: 1
build-ubuntu:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Build Ubuntu
run: python scripts/mk_unix_dist.py --dotnet-key=$GITHUB_WORKSPACE/resources/z3.snk
- name: Upload Ubuntu artifact
uses: actions/upload-artifact@v6
with:
name: ubuntu
path: dist/*.zip
retention-days: 1
build-macos-x64:
runs-on: macos-13
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Build macOS x64
run: python scripts/mk_unix_dist.py --dotnet-key=$GITHUB_WORKSPACE/resources/z3.snk
- name: Upload macOS x64 artifact
uses: actions/upload-artifact@v6
with:
name: macos-x64
path: dist/*.zip
retention-days: 1
build-macos-arm64:
runs-on: macos-13
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Build macOS ARM64
run: python scripts/mk_unix_dist.py --dotnet-key=$GITHUB_WORKSPACE/resources/z3.snk --arch=arm64
- name: Upload macOS ARM64 artifact
uses: actions/upload-artifact@v6
with:
name: macos-arm64
path: dist/*.zip
retention-days: 1
# Package NuGet x64 (includes all platforms except x86)
package-nuget-x64:
needs: [build-windows-x64, build-windows-arm64, build-ubuntu, build-macos-x64, build-macos-arm64]
runs-on: windows-latest
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Download all artifacts
uses: actions/download-artifact@v7
with:
path: packages
- name: List downloaded artifacts
shell: bash
run: find packages -type f
- name: Move artifacts to flat directory
shell: bash
run: |
mkdir -p package-files
find packages -name "*.zip" -exec cp {} package-files/ \;
ls -la package-files/
- name: Setup NuGet
uses: nuget/setup-nuget@v2
with:
nuget-version: 'latest'
- name: Assemble NuGet package
shell: cmd
run: |
cd package-files
python ..\scripts\mk_nuget_task.py . ${{ github.event.inputs.version || '4.15.5' }} https://github.com/Z3Prover/z3 ${{ github.ref_name }} ${{ github.sha }} ${{ github.workspace }} symbols
- name: Pack NuGet package
shell: cmd
run: |
cd package-files
nuget pack out\Microsoft.Z3.sym.nuspec -OutputDirectory . -Verbosity detailed -Symbols -SymbolPackageFormat snupkg -BasePath out
- name: Upload NuGet package
uses: actions/upload-artifact@v6
with:
name: nuget-x64
path: |
package-files/*.nupkg
package-files/*.snupkg
retention-days: 30
# Package NuGet x86
package-nuget-x86:
needs: [build-windows-x86]
runs-on: windows-latest
steps:
- name: Checkout code
uses: actions/checkout@v6
- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: '3.x'
- name: Download x86 artifact
uses: actions/download-artifact@v7
with:
name: windows-x86
path: packages
- name: List downloaded artifacts
shell: bash
run: find packages -type f
- name: Setup NuGet
uses: nuget/setup-nuget@v2
with:
nuget-version: 'latest'
- name: Assemble NuGet package
shell: cmd
run: |
cd packages
python ..\scripts\mk_nuget_task.py . ${{ github.event.inputs.version || '4.15.5' }} https://github.com/Z3Prover/z3 ${{ github.ref_name }} ${{ github.sha }} ${{ github.workspace }} symbols x86
- name: Pack NuGet package
shell: cmd
run: |
cd packages
nuget pack out\Microsoft.Z3.x86.sym.nuspec -OutputDirectory . -Verbosity detailed -Symbols -SymbolPackageFormat snupkg -BasePath out
- name: Upload NuGet package
uses: actions/upload-artifact@v6
with:
name: nuget-x86
path: |
packages/*.nupkg
packages/*.snupkg
retention-days: 30

@@ -17,11 +17,11 @@ jobs:
    steps:
      - name: Checkout code
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
      # Cache ccache (shared across runs)
      - name: Cache ccache
-       uses: actions/cache@v4
+       uses: actions/cache@v5
        with:
          path: ~/.ccache
          key: ${{ runner.os }}-ccache-${{ github.sha }}
@@ -30,7 +30,7 @@ jobs:
      # Cache opam (compiler + packages)
      - name: Cache opam
-       uses: actions/cache@v4
+       uses: actions/cache@v5
        with:
          path: ~/.opam
          key: ${{ runner.os }}-opam-${{ matrix.ocaml-version }}-${{ github.sha }}

.github/workflows/pr-fix.lock.yml (generated, 16 changed lines)
@@ -569,7 +569,7 @@ jobs:
      output: ${{ steps.collect_output.outputs.output }}
    steps:
      - name: Checkout repository
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
      - name: Configure Git credentials
        run: |
          git config --global user.email "github-actions[bot]@users.noreply.github.com"
@@ -1251,7 +1251,7 @@ jobs:
            .write();
      - name: Upload agentic run info
        if: always()
-       uses: actions/upload-artifact@v5
+       uses: actions/upload-artifact@v6
        with:
          name: aw_info.json
          path: /tmp/aw_info.json
@@ -1360,7 +1360,7 @@ jobs:
          echo "" >> $GITHUB_STEP_SUMMARY
      - name: Upload agentic output file
        if: always()
-       uses: actions/upload-artifact@v5
+       uses: actions/upload-artifact@v6
        with:
          name: safe_output.jsonl
          path: ${{ env.GITHUB_AW_SAFE_OUTPUTS }}
@@ -2308,7 +2308,7 @@ jobs:
          await main();
      - name: Upload sanitized agent output
        if: always() && env.GITHUB_AW_AGENT_OUTPUT
-       uses: actions/upload-artifact@v5
+       uses: actions/upload-artifact@v6
        with:
          name: agent_output.json
          path: ${{ env.GITHUB_AW_AGENT_OUTPUT }}
@@ -2845,7 +2845,7 @@ jobs:
            main();
      - name: Upload agent logs
        if: always()
-       uses: actions/upload-artifact@v5
+       uses: actions/upload-artifact@v6
        with:
          name: pr-fix.log
          path: /tmp/pr-fix.log
@@ -2939,7 +2939,7 @@ jobs:
          fi
      - name: Upload git patch
        if: always()
-       uses: actions/upload-artifact@v5
+       uses: actions/upload-artifact@v6
        with:
          name: aw.patch
          path: /tmp/aw.patch
@@ -3371,12 +3371,12 @@ jobs:
    steps:
      - name: Download patch artifact
        continue-on-error: true
-       uses: actions/download-artifact@v6
+       uses: actions/download-artifact@v7
        with:
          name: aw.patch
          path: /tmp/
      - name: Checkout repository
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
        with:
          fetch-depth: 0
      - name: Configure Git credentials

@@ -13,7 +13,7 @@ jobs:
  generate-pull-request-description:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@v5
+     - uses: actions/checkout@v6
      - uses: pelikhan/action-genai-pull-request-descriptor@v0
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}

@@ -19,7 +19,7 @@ jobs:
    steps:
      - name: Checkout code
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
      - name: Setup packages
        run: sudo apt-get update && sudo apt-get install -y python3-dev python3-pip python3-venv

@@ -21,7 +21,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
      - name: Setup node
        uses: actions/setup-node@v6

@@ -21,7 +21,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
-       uses: actions/checkout@v5
+       uses: actions/checkout@v6
      - name: Setup node
        uses: actions/setup-node@v6

@@ -15,7 +15,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@v5
+     - uses: actions/checkout@v6
      - name: Configure CMake
        run: cmake -B ${{github.workspace}}/build -DCMAKE_BUILD_TYPE=${{env.BUILD_TYPE}}

@@ -548,21 +548,93 @@ set(Z3_GENERATED_FILE_EXTRA_DEPENDENCIES
)
################################################################################
- # Z3 components, library and executables
+ # API header files
################################################################################
- include(${PROJECT_SOURCE_DIR}/cmake/z3_add_component.cmake)
- include(${PROJECT_SOURCE_DIR}/cmake/z3_append_linker_flag_list_to_target.cmake)
- add_subdirectory(src)
+ # This lists the API header files that are scanned by
+ # some of the build rules to generate some files needed
+ # by the build; needs to come before add_subdirectory(src)
+ set(Z3_API_HEADER_FILES_TO_SCAN
+   z3_api.h
+   z3_ast_containers.h
+   z3_algebraic.h
+   z3_polynomial.h
+   z3_rcf.h
+   z3_fixedpoint.h
+   z3_optimization.h
+   z3_fpa.h
+   z3_spacer.h
+ )
+ set(Z3_FULL_PATH_API_HEADER_FILES_TO_SCAN "")
+ foreach (header_file ${Z3_API_HEADER_FILES_TO_SCAN})
+   set(full_path_api_header_file "${CMAKE_CURRENT_SOURCE_DIR}/src/api/${header_file}")
+   list(APPEND Z3_FULL_PATH_API_HEADER_FILES_TO_SCAN "${full_path_api_header_file}")
+   if (NOT EXISTS "${full_path_api_header_file}")
+     message(FATAL_ERROR "API header file \"${full_path_api_header_file}\" does not exist")
+   endif()
+ endforeach()
################################################################################
# Create `Z3Config.cmake` and related files for the build tree so clients can
# use Z3 via CMake.
################################################################################
include(CMakePackageConfigHelpers)
- export(EXPORT Z3_EXPORTED_TARGETS
-   NAMESPACE z3::
-   FILE "${PROJECT_BINARY_DIR}/Z3Targets.cmake"
- )
+ option(Z3_BUILD_LIBZ3_CORE "Build the core libz3 library" ON)
+ # Only export targets if we built libz3
+ if (Z3_BUILD_LIBZ3_CORE)
+   ################################################################################
+   # Z3 components, library and executables
+   ################################################################################
+   include(${PROJECT_SOURCE_DIR}/cmake/z3_add_component.cmake)
+   include(${PROJECT_SOURCE_DIR}/cmake/z3_append_linker_flag_list_to_target.cmake)
+   add_subdirectory(src)
+   export(EXPORT Z3_EXPORTED_TARGETS
+     NAMESPACE z3::
+     FILE "${PROJECT_BINARY_DIR}/Z3Targets.cmake"
+   )
+ else()
+   # When not building libz3, we need to find it
+   message(STATUS "Not building libz3, will look for pre-installed library")
+   find_library(Z3_LIBRARY NAMES z3 libz3
+     HINTS ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_LIBDIR}
+     PATH_SUFFIXES lib lib64
+   )
+   if (NOT Z3_LIBRARY)
+     message(FATAL_ERROR "Could not find pre-installed libz3. Please ensure libz3 is installed or set Z3_BUILD_LIBZ3_CORE=ON")
+   endif()
+   message(STATUS "Found libz3: ${Z3_LIBRARY}")
+   # Create an imported target for the pre-installed libz3
+   add_library(libz3 SHARED IMPORTED)
+   set_target_properties(libz3 PROPERTIES
+     IMPORTED_LOCATION "${Z3_LIBRARY}"
+   )
+   # Set include directories for the imported target
+   target_include_directories(libz3 INTERFACE
+     ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}
+   )
+ endif()
+ ################################################################################
+ # Z3 API bindings
+ ################################################################################
+ option(Z3_BUILD_PYTHON_BINDINGS "Build Python bindings for Z3" OFF)
+ if (Z3_BUILD_PYTHON_BINDINGS)
+   # Validate configuration for Python bindings
+   if (Z3_BUILD_LIBZ3_CORE)
+     # Building libz3 together with Python bindings
+     if (NOT Z3_BUILD_LIBZ3_SHARED)
+       message(FATAL_ERROR "The python bindings will not work with a static libz3. "
+         "You either need to disable Z3_BUILD_PYTHON_BINDINGS or enable Z3_BUILD_LIBZ3_SHARED")
+     endif()
+   else()
+     # Using pre-installed libz3 for Python bindings
+     message(STATUS "Building Python bindings with pre-installed libz3")
+   endif()
+   add_subdirectory(src/api/python)
+ endif()
set(Z3_FIRST_PACKAGE_INCLUDE_DIR "${PROJECT_BINARY_DIR}/src/api")
set(Z3_SECOND_PACKAGE_INCLUDE_DIR "${PROJECT_SOURCE_DIR}/src/api")
set(Z3_CXX_PACKAGE_INCLUDE_DIR "${PROJECT_SOURCE_DIR}/src/api/c++")
@@ -593,12 +665,15 @@ configure_file("${CMAKE_CURRENT_SOURCE_DIR}/z3.pc.cmake.in"
# Create `Z3Config.cmake` and related files for install tree so clients can use
# Z3 via CMake.
################################################################################
- install(EXPORT
-   Z3_EXPORTED_TARGETS
-   FILE "Z3Targets.cmake"
-   NAMESPACE z3::
-   DESTINATION "${CMAKE_INSTALL_Z3_CMAKE_PACKAGE_DIR}"
- )
+ # Only install targets if we built libz3
+ if (Z3_BUILD_LIBZ3_CORE)
+   install(EXPORT
+     Z3_EXPORTED_TARGETS
+     FILE "Z3Targets.cmake"
+     NAMESPACE z3::
+     DESTINATION "${CMAKE_INSTALL_Z3_CMAKE_PACKAGE_DIR}"
+   )
+ endif()
set(Z3_INSTALL_TREE_CMAKE_CONFIG_FILE "${PROJECT_BINARY_DIR}/cmake/Z3Config.cmake")
set(Z3_FIRST_PACKAGE_INCLUDE_DIR "${CMAKE_INSTALL_INCLUDEDIR}")
set(Z3_SECOND_INCLUDE_DIR "")

@@ -410,9 +410,10 @@ The following useful options can be passed to CMake whilst configuring.
* ``Python3_EXECUTABLE`` - STRING. The python executable to use during the build.
* ``Z3_ENABLE_TRACING_FOR_NON_DEBUG`` - BOOL. If set to ``TRUE`` enable tracing in non-debug builds, if set to ``FALSE`` disable tracing in non-debug builds. Note in debug builds tracing is always enabled.
* ``Z3_BUILD_LIBZ3_SHARED`` - BOOL. If set to ``TRUE`` build libz3 as a shared library otherwise build as a static library.
+ * ``Z3_BUILD_LIBZ3_CORE`` - BOOL. If set to ``TRUE`` (default) build the core libz3 library. If set to ``FALSE``, skip building libz3 and look for a pre-installed library instead. This is useful when building only Python bindings on top of an already-installed libz3.
* ``Z3_ENABLE_EXAMPLE_TARGETS`` - BOOL. If set to ``TRUE`` add the build targets for building the API examples.
* ``Z3_USE_LIB_GMP`` - BOOL. If set to ``TRUE`` use the GNU multiple precision library. If set to ``FALSE`` use an internal implementation.
- * ``Z3_BUILD_PYTHON_BINDINGS`` - BOOL. If set to ``TRUE`` then Z3's python bindings will be built.
+ * ``Z3_BUILD_PYTHON_BINDINGS`` - BOOL. If set to ``TRUE`` then Z3's python bindings will be built. When ``Z3_BUILD_LIBZ3_CORE`` is ``FALSE``, this will build only the Python bindings using a pre-installed libz3.
* ``Z3_INSTALL_PYTHON_BINDINGS`` - BOOL. If set to ``TRUE`` and ``Z3_BUILD_PYTHON_BINDINGS`` is ``TRUE`` then running the ``install`` target will install Z3's Python bindings.
* ``Z3_BUILD_DOTNET_BINDINGS`` - BOOL. If set to ``TRUE`` then Z3's .NET bindings will be built.
* ``Z3_INSTALL_DOTNET_BINDINGS`` - BOOL. If set to ``TRUE`` and ``Z3_BUILD_DOTNET_BINDINGS`` is ``TRUE`` then running the ``install`` target will install Z3's .NET bindings.
@@ -464,6 +465,49 @@ cmake -DCMAKE_BUILD_TYPE=Release -DZ3_ENABLE_TRACING_FOR_NON_DEBUG=FALSE ../
Z3 exposes various language bindings for its API. Below are some notes on building
and/or installing these bindings when building Z3 with CMake.
### Python bindings
#### Building Python bindings with libz3
The default behavior when ``Z3_BUILD_PYTHON_BINDINGS=ON`` is to build both the libz3 library
and the Python bindings together:
```
mkdir build
cd build
cmake -DZ3_BUILD_PYTHON_BINDINGS=ON -DZ3_BUILD_LIBZ3_SHARED=ON ../
make
```
#### Building only Python bindings (using pre-installed libz3)
For package managers like conda-forge that want to avoid rebuilding libz3 for each Python version,
you can build only the Python bindings by setting ``Z3_BUILD_LIBZ3_CORE=OFF``. This assumes
libz3 is already installed on your system:
```
# First, build and install libz3 (once)
mkdir build-libz3
cd build-libz3
cmake -DZ3_BUILD_LIBZ3_SHARED=ON -DCMAKE_INSTALL_PREFIX=/path/to/prefix ../
make
make install
# Then, build Python bindings for each Python version (quickly, without rebuilding libz3)
cd ..
mkdir build-py310
cd build-py310
cmake -DZ3_BUILD_LIBZ3_CORE=OFF \
-DZ3_BUILD_PYTHON_BINDINGS=ON \
-DCMAKE_INSTALL_PREFIX=/path/to/prefix \
-DPython3_EXECUTABLE=/path/to/python3.10 ../
make
make install
```
This approach significantly reduces build time when packaging for multiple Python versions,
as the expensive libz3 compilation happens only once.
### Java bindings
The CMake build uses the ``FindJava`` and ``FindJNI`` cmake modules to detect the

@@ -6,7 +6,13 @@ set(GCC_AND_CLANG_WARNINGS
  "-Wall"
)
set(GCC_ONLY_WARNINGS "")
- set(CLANG_ONLY_WARNINGS "")
+ # Disable C++98 compatibility warnings to prevent excessive warning output
+ # when building with clang-cl or when -Weverything is enabled.
+ # These warnings are not useful for Z3 since it requires C++20.
+ set(CLANG_ONLY_WARNINGS
+   "-Wno-c++98-compat"
+   "-Wno-c++98-compat-pedantic"
+ )
set(MSVC_WARNINGS "/W3")
################################################################################

@@ -9,7 +9,8 @@ import sys
import re
import os
- BUILD_DIR='../build'
+ build_env = dict(os.environ)
+ BUILD_DIR = '../' + build_env.get('Z3BUILD', 'build')
OUTPUT_DIRECTORY=os.path.join(os.getcwd(), 'api')
def parse_options():
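The hunk above makes the docs script's build directory overridable from the environment, which is how the docs workflow points it at `build-x64`. A minimal standalone sketch of that default-with-override pattern (the `build_dir` helper name is hypothetical; the script itself computes `BUILD_DIR` once at import time):

```python
import os

def build_dir(env=None):
    # Default to ../build, but honor a Z3BUILD environment variable,
    # mirroring BUILD_DIR = '../' + build_env.get('Z3BUILD', 'build').
    build_env = dict(os.environ if env is None else env)
    return '../' + build_env.get('Z3BUILD', 'build')

print(build_dir({}))                        # -> ../build
print(build_dir({'Z3BUILD': 'build-x64'}))  # -> ../build-x64
```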

@@ -44,10 +44,20 @@ def classify_package(f, arch):
    return None

def replace(src, dst):
+     """
+     Replace destination file with source file.
+     Removes the destination file if it exists, then moves the source file to the destination.
+     This ensures that the file is always moved, whether or not the destination exists.
+     Previous buggy implementation only moved when removal failed, causing files to be
+     deleted but not replaced when the destination already existed.
+     """
    try:
        os.remove(dst)
    except:
-         shutil.move(src, dst)
+         pass
+     shutil.move(src, dst)

def unpack(packages, symbols, arch):
    # unzip files in packages
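The corrected `replace` can be exercised in isolation. A self-contained sketch (temporary file names are illustrative, and the bare `except` is narrowed to `OSError` here):

```python
import os
import shutil
import tempfile

def replace(src, dst):
    # Remove the destination if it exists, then always move src into place.
    # The pre-fix version called shutil.move only inside the except branch,
    # so when dst already existed it was deleted but never replaced.
    try:
        os.remove(dst)
    except OSError:
        pass
    shutil.move(src, dst)

d = tempfile.mkdtemp()
src = os.path.join(d, 'src.txt')
dst = os.path.join(d, 'dst.txt')

# Case 1: destination does not exist yet.
with open(src, 'w') as f:
    f.write('first')
replace(src, dst)
print(open(dst).read())   # first

# Case 2: destination already exists; it must be overwritten, not just deleted.
with open(src, 'w') as f:
    f.write('second')
replace(src, dst)
print(open(dst).read())   # second
```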

@@ -2686,8 +2686,6 @@ def mk_config():
        CPPFLAGS = '%s -DZ3DEBUG -D_DEBUG' % CPPFLAGS
    else:
        CXXFLAGS = '%s -O3' % CXXFLAGS
-         if GPROF:
-             CXXFLAGS += '-fomit-frame-pointer'
        CPPFLAGS = '%s -DNDEBUG -D_EXTERNAL_RELEASE' % CPPFLAGS
    if is_CXX_clangpp():
        CXXFLAGS = '%s -Wno-unknown-pragmas -Wno-overloaded-virtual -Wno-unused-value' % CXXFLAGS

@@ -14,7 +14,7 @@ stages:
  displayName: "Mac Build"
  timeoutInMinutes: 90
  pool:
-   vmImage: "macOS-13"
+   vmImage: "macOS-latest"
  steps:
  - task: PythonScript@0
    displayName: Build
@@ -43,7 +43,7 @@ stages:
- job: MacBuildArm64
  displayName: "Mac ARM64 Build"
  pool:
-   vmImage: "macOS-13"
+   vmImage: "macOS-latest"
  steps:
  - script: python scripts/mk_unix_dist.py --dotnet-key=$(Build.SourcesDirectory)/resources/z3.snk --arch=arm64
  - script: git clone https://github.com/z3prover/z3test z3test
@@ -196,8 +196,7 @@ stages:
- task: PublishPipelineArtifact@0
  inputs:
    artifactName: 'ManyLinuxPythonBuildArm64'
    targetPath: $(Build.ArtifactStagingDirectory)
- template: build-win-signed.yml
  parameters:

@@ -558,6 +558,8 @@ def param2java(p):
        return "LongPtr"
    elif param_type(p) == STRING:
        return "StringPtr"
+     elif param_type(p) == BOOL:
+         return "BoolPtr"
    else:
        print("ERROR: unreachable code")
        assert(False)
@@ -623,6 +625,7 @@ def mk_java(java_src, java_dir, package_name):
    java_native.write(' public static class StringPtr { public String value; }\n')
    java_native.write(' public static class ObjArrayPtr { public long[] value; }\n')
    java_native.write(' public static class UIntArrayPtr { public int[] value; }\n')
+     java_native.write(' public static class BoolPtr { public boolean value; }\n')
    java_native.write(' public static native void setInternalErrorHandler(long ctx);\n\n')
    java_native.write(' static {\n')
@@ -1086,6 +1089,9 @@ def def_API(name, result, params):
    elif ty == INT64:
        log_c.write(" I(0);\n")
        exe_c.write("in.get_int64_addr(%s)" % i)
+     elif ty == BOOL:
+         log_c.write(" I(0);\n")
+         exe_c.write("in.get_bool_addr(%s)" % i)
    elif ty == VOID_PTR:
        log_c.write(" P(0);\n")
        exe_c.write("in.get_obj_addr(%s)" % i)

@@ -1,29 +1,3 @@
- ################################################################################
- # API header files
- ################################################################################
- # This lists the API header files that are scanned by
- # some of the build rules to generate some files needed
- # by the build
- set(Z3_API_HEADER_FILES_TO_SCAN
-   z3_api.h
-   z3_ast_containers.h
-   z3_algebraic.h
-   z3_polynomial.h
-   z3_rcf.h
-   z3_fixedpoint.h
-   z3_optimization.h
-   z3_fpa.h
-   z3_spacer.h
- )
- set(Z3_FULL_PATH_API_HEADER_FILES_TO_SCAN "")
- foreach (header_file ${Z3_API_HEADER_FILES_TO_SCAN})
-   set(full_path_api_header_file "${CMAKE_CURRENT_SOURCE_DIR}/api/${header_file}")
-   list(APPEND Z3_FULL_PATH_API_HEADER_FILES_TO_SCAN "${full_path_api_header_file}")
-   if (NOT EXISTS "${full_path_api_header_file}")
-     message(FATAL_ERROR "API header file \"${full_path_api_header_file}\" does not exist")
-   endif()
- endforeach()
################################################################################
# Traverse directories each adding a Z3 component
################################################################################
@@ -305,7 +279,7 @@ endif()
################################################################################
cmake_dependent_option(Z3_BUILD_EXECUTABLE
  "Build the z3 executable" ON
-   "CMAKE_SOURCE_DIR STREQUAL PROJECT_SOURCE_DIR" OFF)
+   "CMAKE_SOURCE_DIR STREQUAL PROJECT_SOURCE_DIR;Z3_BUILD_LIBZ3_CORE" OFF)
if (Z3_BUILD_EXECUTABLE)
  add_subdirectory(shell)
@@ -317,26 +291,13 @@ endif()
cmake_dependent_option(Z3_BUILD_TEST_EXECUTABLES
  "Build test executables" ON
-   "CMAKE_SOURCE_DIR STREQUAL PROJECT_SOURCE_DIR" OFF)
+   "CMAKE_SOURCE_DIR STREQUAL PROJECT_SOURCE_DIR;Z3_BUILD_LIBZ3_CORE" OFF)
if (Z3_BUILD_TEST_EXECUTABLES)
  add_subdirectory(test)
endif()
- ################################################################################
- # Z3 API bindings
- ################################################################################
- option(Z3_BUILD_PYTHON_BINDINGS "Build Python bindings for Z3" OFF)
- if (Z3_BUILD_PYTHON_BINDINGS)
-   if (NOT Z3_BUILD_LIBZ3_SHARED)
-     message(FATAL_ERROR "The python bindings will not work with a static libz3. "
-       "You either need to disable Z3_BUILD_PYTHON_BINDINGS or enable Z3_BUILD_LIBZ3_SHARED")
-   endif()
-   add_subdirectory(api/python)
- endif()
################################################################################
# .NET bindings
################################################################################


@@ -156,8 +156,15 @@ extern "C" {
 }
 bool Z3_API Z3_is_algebraic_number(Z3_context c, Z3_ast a) {
+    Z3_TRY;
     LOG_Z3_is_algebraic_number(c, a);
+    RESET_ERROR_CODE();
+    if (!is_expr(a)) {
+        SET_ERROR_CODE(Z3_INVALID_ARG, nullptr);
+        return false;
+    }
     return mk_c(c)->autil().is_irrational_algebraic_numeral(to_expr(a));
+    Z3_CATCH_RETURN(false);
 }
 Z3_ast Z3_API Z3_get_algebraic_number_lower(Z3_context c, Z3_ast a, unsigned precision) {


@@ -268,7 +268,6 @@ extern "C" {
 MK_UNARY(Z3_mk_set_complement, mk_c(c)->get_array_fid(), OP_SET_COMPLEMENT, SKIP);
 MK_BINARY(Z3_mk_set_subset, mk_c(c)->get_array_fid(), OP_SET_SUBSET, SKIP);
 MK_BINARY(Z3_mk_array_ext, mk_c(c)->get_array_fid(), OP_ARRAY_EXT, SKIP);
-MK_BINARY(Z3_mk_set_has_size, mk_c(c)->get_array_fid(), OP_SET_HAS_SIZE, SKIP);
 Z3_ast Z3_API Z3_mk_as_array(Z3_context c, Z3_func_decl f) {
     Z3_TRY;


@@ -1192,8 +1192,6 @@ extern "C" {
     case OP_SET_SUBSET: return Z3_OP_SET_SUBSET;
     case OP_AS_ARRAY: return Z3_OP_AS_ARRAY;
     case OP_ARRAY_EXT: return Z3_OP_ARRAY_EXT;
-    case OP_SET_CARD: return Z3_OP_SET_CARD;
-    case OP_SET_HAS_SIZE: return Z3_OP_SET_HAS_SIZE;
     default:
         return Z3_OP_INTERNAL;
 }


@@ -896,7 +896,7 @@ extern "C" {
     Z3_CATCH_RETURN(0);
 }
-bool Z3_API Z3_fpa_get_numeral_sign(Z3_context c, Z3_ast t, int * sgn) {
+bool Z3_API Z3_fpa_get_numeral_sign(Z3_context c, Z3_ast t, bool * sgn) {
     Z3_TRY;
     LOG_Z3_fpa_get_numeral_sign(c, t, sgn);
     RESET_ERROR_CODE();
@@ -1224,6 +1224,20 @@ extern "C" {
     Z3_CATCH_RETURN(nullptr);
 }
+bool Z3_API Z3_fpa_is_numeral(Z3_context c, Z3_ast t) {
+    Z3_TRY;
+    LOG_Z3_fpa_is_numeral(c, t);
+    RESET_ERROR_CODE();
+    api::context * ctx = mk_c(c);
+    fpa_util & fu = ctx->fpautil();
+    if (!is_expr(t)) {
+        SET_ERROR_CODE(Z3_INVALID_ARG, nullptr);
+        return false;
+    }
+    return fu.is_numeral(to_expr(t));
+    Z3_CATCH_RETURN(false);
+}
 bool Z3_API Z3_fpa_is_numeral_nan(Z3_context c, Z3_ast t) {
     Z3_TRY;
     LOG_Z3_fpa_is_numeral_nan(c, t);


@@ -481,4 +481,22 @@ extern "C" {
     Z3_CATCH;
 }
+Z3_optimize Z3_API Z3_optimize_translate(Z3_context c, Z3_optimize o, Z3_context target) {
+    Z3_TRY;
+    LOG_Z3_optimize_translate(c, o, target);
+    RESET_ERROR_CODE();
+    // Translate the opt::context to the target manager
+    opt::context* translated_ctx = to_optimize_ptr(o)->translate(mk_c(target)->m());
+    // Create a new Z3_optimize_ref in the target context
+    Z3_optimize_ref* result_ref = alloc(Z3_optimize_ref, *mk_c(target));
+    result_ref->m_opt = translated_ctx;
+    mk_c(target)->save_object(result_ref);
+    Z3_optimize result = of_optimize(result_ref);
+    RETURN_Z3(result);
+    Z3_CATCH_RETURN(nullptr);
+}
 };


@@ -385,7 +385,7 @@ extern "C" {
     Z3_CATCH_RETURN(nullptr);
 }
-int Z3_API Z3_rcf_interval(Z3_context c, Z3_rcf_num a, int * lower_is_inf, int * lower_is_open, Z3_rcf_num * lower, int * upper_is_inf, int * upper_is_open, Z3_rcf_num * upper) {
+int Z3_API Z3_rcf_interval(Z3_context c, Z3_rcf_num a, bool * lower_is_inf, bool * lower_is_open, Z3_rcf_num * lower, bool * upper_is_inf, bool * upper_is_open, Z3_rcf_num * upper) {
     Z3_TRY;
     LOG_Z3_rcf_interval(c, a, lower_is_inf, lower_is_open, lower, upper_is_inf, upper_is_open, upper);
     RESET_ERROR_CODE();


@@ -3313,6 +3313,7 @@ namespace z3 {
     Z3_optimize m_opt;
 public:
+    struct translate {};
     class handle final {
         unsigned m_h;
     public:
@@ -3320,6 +3321,12 @@ namespace z3 {
         unsigned h() const { return m_h; }
     };
     optimize(context& c):object(c) { m_opt = Z3_mk_optimize(c); Z3_optimize_inc_ref(c, m_opt); }
+    optimize(context & c, optimize const& src, translate): object(c) {
+        Z3_optimize o = Z3_optimize_translate(src.ctx(), src, c);
+        check_error();
+        m_opt = o;
+        Z3_optimize_inc_ref(c, m_opt);
+    }
     optimize(optimize const & o):object(o), m_opt(o.m_opt) {
         Z3_optimize_inc_ref(o.ctx(), o.m_opt);
     }


@@ -50,8 +50,8 @@ namespace Microsoft.Z3
 {
     get
     {
-        int res = 0;
-        if (Native.Z3_fpa_get_numeral_sign(Context.nCtx, NativeObject, ref res) == 0)
+        byte res = 0;
+        if (0 == Native.Z3_fpa_get_numeral_sign(Context.nCtx, NativeObject, ref res))
             throw new Z3Exception("Sign is not a Boolean value");
         return res != 0;
     }


@@ -41,7 +41,7 @@ namespace Microsoft.Z3
 public static bool Open(string filename)
 {
     m_is_open = true;
-    return Native.Z3_open_log(filename) == 1;
+    return 0 != Native.Z3_open_log(filename);
 }
 /// <summary>


@@ -156,7 +156,7 @@ namespace Microsoft.Z3
 /// <remarks>
 /// This API is an alternative to <see cref="Check(Expr[])"/> with assumptions for extracting unsat cores.
 /// Both APIs can be used in the same solver. The unsat core will contain a combination
-/// of the Boolean variables provided using <see cref="AssertAndTrack(BoolExpr[],BoolExpr[])"/>
+/// of the Boolean variables provided using <see cref="AssertAndTrack(BoolExpr,BoolExpr)"/>
 /// and the Boolean literals
 /// provided using <see cref="Check(Expr[])"/> with assumptions.
 /// </remarks>


@@ -27,10 +27,10 @@ public class FPNum extends FPExpr
  * @throws Z3Exception
  */
 public boolean getSign() {
-    Native.IntPtr res = new Native.IntPtr();
+    Native.BoolPtr res = new Native.BoolPtr();
     if (!Native.fpaGetNumeralSign(getContext().nCtx(), getNativeObject(), res))
         throw new Z3Exception("Sign is not a Boolean value");
-    return res.value != 0;
+    return res.value;
 }
 /**


@@ -36,7 +36,7 @@ public final class Log
 public static boolean open(String filename)
 {
     m_is_open = true;
-    return Native.openLog(filename) == 1;
+    return Native.openLog(filename);
 }
 /**


@@ -46,12 +46,15 @@
 }
 },
 "node_modules/@babel/code-frame": {
-"version": "7.18.6",
-"resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.18.6.tgz",
-"integrity": "sha512-TDCmlK5eOvH+eH7cdAFlNXeVJqWIQ7gW9tY1GJIpUtFb6CmjVyq2VM3u71bOyR8CRihcCgMUYoDNyLXao3+70Q==",
+"version": "7.27.1",
+"resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.27.1.tgz",
+"integrity": "sha512-cjQ7ZlQ0Mv3b47hABuTevyTuYN4i+loJKGeV9flcCgIK37cCXRh+L1bd3iBHlynerhQ7BhCkn2BPbQUL+rGqFg==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
-"@babel/highlight": "^7.18.6"
+"@babel/helper-validator-identifier": "^7.27.1",
+"js-tokens": "^4.0.0",
+"picocolors": "^1.1.1"
 },
 "engines": {
 "node": ">=6.9.0"
@@ -236,19 +239,21 @@
 }
 },
 "node_modules/@babel/helper-string-parser": {
-"version": "7.19.4",
-"resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.19.4.tgz",
-"integrity": "sha512-nHtDoQcuqFmwYNYPz3Rah5ph2p8PFeFCsZk9A/48dPc/rGocJ5J3hAAZ7pb76VWX3fZKu+uEr/FhH5jLx7umrw==",
+"version": "7.27.1",
+"resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.27.1.tgz",
+"integrity": "sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==",
 "dev": true,
+"license": "MIT",
 "engines": {
 "node": ">=6.9.0"
 }
 },
 "node_modules/@babel/helper-validator-identifier": {
-"version": "7.19.1",
-"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.19.1.tgz",
-"integrity": "sha512-awrNfaMtnHUr653GgGEs++LlAvW6w+DcPrOliSMXWCKo597CwL5Acf/wWdNkf/tfEQE3mjkeD1YOVZOUV/od1w==",
+"version": "7.28.5",
+"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.28.5.tgz",
+"integrity": "sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==",
 "dev": true,
+"license": "MIT",
 "engines": {
 "node": ">=6.9.0"
 }
@@ -263,38 +268,28 @@
 }
 },
 "node_modules/@babel/helpers": {
-"version": "7.19.4",
-"resolved": "https://registry.npmjs.org/@babel/helpers/-/helpers-7.19.4.tgz",
-"integrity": "sha512-G+z3aOx2nfDHwX/kyVii5fJq+bgscg89/dJNWpYeKeBv3v9xX8EIabmx1k6u9LS04H7nROFVRVK+e3k0VHp+sw==",
+"version": "7.28.4",
+"resolved": "https://registry.npmjs.org/@babel/helpers/-/helpers-7.28.4.tgz",
+"integrity": "sha512-HFN59MmQXGHVyYadKLVumYsA9dBFun/ldYxipEjzA4196jpLZd8UjEEBLkbEkvfYreDqJhZxYAWFPtrfhNpj4w==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
-"@babel/template": "^7.18.10",
-"@babel/traverse": "^7.19.4",
-"@babel/types": "^7.19.4"
-},
-"engines": {
-"node": ">=6.9.0"
-}
-},
-"node_modules/@babel/highlight": {
-"version": "7.18.6",
-"resolved": "https://registry.npmjs.org/@babel/highlight/-/highlight-7.18.6.tgz",
-"integrity": "sha512-u7stbOuYjaPezCuLj29hNW1v64M2Md2qupEKP1fHc7WdOA3DgLh37suiSrZYY7haUB7iBeQZ9P1uiRF359do3g==",
-"dev": true,
-"dependencies": {
-"@babel/helper-validator-identifier": "^7.18.6",
-"chalk": "^2.0.0",
-"js-tokens": "^4.0.0"
+"@babel/template": "^7.27.2",
+"@babel/types": "^7.28.4"
 },
 "engines": {
 "node": ">=6.9.0"
 }
 },
 "node_modules/@babel/parser": {
-"version": "7.19.4",
-"resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.19.4.tgz",
-"integrity": "sha512-qpVT7gtuOLjWeDTKLkJ6sryqLliBaFpAtGeqw5cs5giLldvh+Ch0plqnUMKoVAUS6ZEueQQiZV+p5pxtPitEsA==",
+"version": "7.28.5",
+"resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.28.5.tgz",
+"integrity": "sha512-KKBU1VGYR7ORr3At5HAtUQ+TV3SzRCXmA/8OdDZiLDBIZxVyzXuztPjfLd3BV1PRAQGCMWWSHYhL0F8d5uHBDQ==",
 "dev": true,
+"license": "MIT",
+"dependencies": {
+"@babel/types": "^7.28.5"
+},
 "bin": {
 "parser": "bin/babel-parser.js"
 },
@@ -465,26 +460,25 @@
 }
 },
 "node_modules/@babel/runtime": {
-"version": "7.19.4",
-"resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.19.4.tgz",
-"integrity": "sha512-EXpLCrk55f+cYqmHsSR+yD/0gAIMxxA9QK9lnQWzhMCvt+YmoBN7Zx94s++Kv0+unHk39vxNO8t+CMA2WSS3wA==",
+"version": "7.28.4",
+"resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.28.4.tgz",
+"integrity": "sha512-Q/N6JNWvIvPnLDvjlE1OUBLPQHH6l3CltCEsHIujp45zQUSSh8K+gHnaEX45yAT1nyngnINhvWtzN+Nb9D8RAQ==",
 "dev": true,
-"dependencies": {
-"regenerator-runtime": "^0.13.4"
-},
+"license": "MIT",
 "engines": {
 "node": ">=6.9.0"
 }
 },
 "node_modules/@babel/template": {
-"version": "7.18.10",
-"resolved": "https://registry.npmjs.org/@babel/template/-/template-7.18.10.tgz",
-"integrity": "sha512-TI+rCtooWHr3QJ27kJxfjutghu44DLnasDMwpDqCXVTal9RLp3RSYNh4NdBrRP2cQAoG9A8juOQl6P6oZG4JxA==",
+"version": "7.27.2",
+"resolved": "https://registry.npmjs.org/@babel/template/-/template-7.27.2.tgz",
+"integrity": "sha512-LPDZ85aEJyYSd18/DkjNh4/y1ntkE5KwUHWTiqgRxruuZL2F1yuHligVHLvcHY2vMHXttKFpJn6LwfI7cw7ODw==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
-"@babel/code-frame": "^7.18.6",
-"@babel/parser": "^7.18.10",
-"@babel/types": "^7.18.10"
+"@babel/code-frame": "^7.27.1",
+"@babel/parser": "^7.27.2",
+"@babel/types": "^7.27.1"
 },
 "engines": {
 "node": ">=6.9.0"
@@ -511,19 +505,6 @@
 "node": ">=6.9.0"
 }
 },
-"node_modules/@babel/traverse/node_modules/@babel/code-frame": {
-"version": "7.22.13",
-"resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.22.13.tgz",
-"integrity": "sha512-XktuhWlJ5g+3TJXc5upd9Ks1HutSArik6jf2eAjYFyIOf4ej3RN+184cZbzDvbPnuTJIUhPKKJE3cIsYTiAT3w==",
-"dev": true,
-"dependencies": {
-"@babel/highlight": "^7.22.13",
-"chalk": "^2.4.2"
-},
-"engines": {
-"node": ">=6.9.0"
-}
-},
 "node_modules/@babel/traverse/node_modules/@babel/generator": {
 "version": "7.23.0",
 "resolved": "https://registry.npmjs.org/@babel/generator/-/generator-7.23.0.tgz",
@@ -585,78 +566,6 @@
 "node": ">=6.9.0"
 }
 },
-"node_modules/@babel/traverse/node_modules/@babel/helper-string-parser": {
-"version": "7.22.5",
-"resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.22.5.tgz",
-"integrity": "sha512-mM4COjgZox8U+JcXQwPijIZLElkgEpO5rsERVDJTc2qfCDfERyob6k5WegS14SX18IIjv+XD+GrqNumY5JRCDw==",
-"dev": true,
-"engines": {
-"node": ">=6.9.0"
-}
-},
-"node_modules/@babel/traverse/node_modules/@babel/helper-validator-identifier": {
-"version": "7.22.20",
-"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.22.20.tgz",
-"integrity": "sha512-Y4OZ+ytlatR8AI+8KZfKuL5urKp7qey08ha31L8b3BwewJAoJamTzyvxPR/5D+KkdJCGPq/+8TukHBlY10FX9A==",
-"dev": true,
-"engines": {
-"node": ">=6.9.0"
-}
-},
-"node_modules/@babel/traverse/node_modules/@babel/highlight": {
-"version": "7.22.20",
-"resolved": "https://registry.npmjs.org/@babel/highlight/-/highlight-7.22.20.tgz",
-"integrity": "sha512-dkdMCN3py0+ksCgYmGG8jKeGA/8Tk+gJwSYYlFGxG5lmhfKNoAy004YpLxpS1W2J8m/EK2Ew+yOs9pVRwO89mg==",
-"dev": true,
-"dependencies": {
-"@babel/helper-validator-identifier": "^7.22.20",
-"chalk": "^2.4.2",
-"js-tokens": "^4.0.0"
-},
-"engines": {
-"node": ">=6.9.0"
-}
-},
-"node_modules/@babel/traverse/node_modules/@babel/parser": {
-"version": "7.23.0",
-"resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.23.0.tgz",
-"integrity": "sha512-vvPKKdMemU85V9WE/l5wZEmImpCtLqbnTvqDS2U1fJ96KrxoW7KrXhNsNCblQlg8Ck4b85yxdTyelsMUgFUXiw==",
-"dev": true,
-"bin": {
-"parser": "bin/babel-parser.js"
-},
-"engines": {
-"node": ">=6.0.0"
-}
-},
-"node_modules/@babel/traverse/node_modules/@babel/template": {
-"version": "7.22.15",
-"resolved": "https://registry.npmjs.org/@babel/template/-/template-7.22.15.tgz",
-"integrity": "sha512-QPErUVm4uyJa60rkI73qneDacvdvzxshT3kksGqlGWYdOTIUOwJ7RDUL8sGqslY1uXWSL6xMFKEXDS3ox2uF0w==",
-"dev": true,
-"dependencies": {
-"@babel/code-frame": "^7.22.13",
-"@babel/parser": "^7.22.15",
-"@babel/types": "^7.22.15"
-},
-"engines": {
-"node": ">=6.9.0"
-}
-},
-"node_modules/@babel/traverse/node_modules/@babel/types": {
-"version": "7.23.0",
-"resolved": "https://registry.npmjs.org/@babel/types/-/types-7.23.0.tgz",
-"integrity": "sha512-0oIyUfKoI3mSqMvsxBdclDwxXKXAUA8v/apZbc+iSyARYou1o8ZGDxbUYyLFoW2arqS2jDGqJuZvv1d/io1axg==",
-"dev": true,
-"dependencies": {
-"@babel/helper-string-parser": "^7.22.5",
-"@babel/helper-validator-identifier": "^7.22.20",
-"to-fast-properties": "^2.0.0"
-},
-"engines": {
-"node": ">=6.9.0"
-}
-},
 "node_modules/@babel/traverse/node_modules/@jridgewell/gen-mapping": {
 "version": "0.3.3",
 "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.3.tgz",
@@ -672,14 +581,14 @@
 }
 },
 "node_modules/@babel/types": {
-"version": "7.19.4",
-"resolved": "https://registry.npmjs.org/@babel/types/-/types-7.19.4.tgz",
-"integrity": "sha512-M5LK7nAeS6+9j7hAq+b3fQs+pNfUtTGq+yFFfHnauFA8zQtLRfmuipmsKDKKLuyG+wC8ABW43A153YNawNTEtw==",
+"version": "7.28.5",
+"resolved": "https://registry.npmjs.org/@babel/types/-/types-7.28.5.tgz",
+"integrity": "sha512-qQ5m48eI/MFLQ5PxQj4PFaprjyCTLI37ElWMmNs0K8Lk3dVeOdNpB3ks8jc7yM5CDmVC73eMVk/trk3fgmrUpA==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
-"@babel/helper-string-parser": "^7.19.4",
-"@babel/helper-validator-identifier": "^7.19.1",
-"to-fast-properties": "^2.0.0"
+"@babel/helper-string-parser": "^7.27.1",
+"@babel/helper-validator-identifier": "^7.28.5"
 },
 "engines": {
 "node": ">=6.9.0"
@@ -1968,10 +1877,11 @@
 "dev": true
 },
 "node_modules/brace-expansion": {
-"version": "1.1.11",
-"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
-"integrity": "sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==",
+"version": "1.1.12",
+"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
+"integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
 "balanced-match": "^1.0.0",
 "concat-map": "0.0.1"
@@ -2250,10 +2160,11 @@
 "dev": true
 },
 "node_modules/cross-spawn": {
-"version": "6.0.5",
-"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-6.0.5.tgz",
-"integrity": "sha512-eTVLrBSt7fjbDygz805pMnstIs2VTBNkRm0qxZd+M7A5XDdxVRWO5MxGBXZhjY4cqLYLdtrGqRf8mBPmzwSpWQ==",
+"version": "6.0.6",
+"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-6.0.6.tgz",
+"integrity": "sha512-VqCUuhcd1iB+dsv8gxPttb5iZh/D0iubSP21g36KXdEuf6I5JiioesUVjpCdHV9MZRUfVFlvwtIUyPfxo5trtw==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
 "nice-try": "^1.0.4",
 "path-key": "^2.0.1",
@@ -2505,10 +2416,11 @@
 }
 },
 "node_modules/execa/node_modules/cross-spawn": {
-"version": "7.0.3",
-"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz",
-"integrity": "sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==",
+"version": "7.0.6",
+"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
+"integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
 "path-key": "^3.1.0",
 "shebang-command": "^2.0.0",
@@ -3645,6 +3557,117 @@
 "node": ">=8"
 }
 },
+"node_modules/jest-cli": {
+"version": "28.1.3",
+"resolved": "https://registry.npmjs.org/jest-cli/-/jest-cli-28.1.3.tgz",
+"integrity": "sha512-roY3kvrv57Azn1yPgdTebPAXvdR2xfezaKKYzVxZ6It/5NCxzJym6tUI5P1zkdWhfUYkxEI9uZWcQdaFLo8mJQ==",
+"dev": true,
+"license": "MIT",
+"dependencies": {
+"@jest/core": "^28.1.3",
+"@jest/test-result": "^28.1.3",
+"@jest/types": "^28.1.3",
+"chalk": "^4.0.0",
+"exit": "^0.1.2",
+"graceful-fs": "^4.2.9",
+"import-local": "^3.0.2",
+"jest-config": "^28.1.3",
+"jest-util": "^28.1.3",
+"jest-validate": "^28.1.3",
+"prompts": "^2.0.1",
+"yargs": "^17.3.1"
+},
+"bin": {
+"jest": "bin/jest.js"
+},
+"engines": {
+"node": "^12.13.0 || ^14.15.0 || ^16.10.0 || >=17.0.0"
+},
+"peerDependencies": {
+"node-notifier": "^8.0.1 || ^9.0.0 || ^10.0.0"
+},
+"peerDependenciesMeta": {
+"node-notifier": {
+"optional": true
+}
+}
+},
+"node_modules/jest-cli/node_modules/ansi-styles": {
+"version": "4.3.0",
+"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
+"integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
+"dev": true,
+"license": "MIT",
+"dependencies": {
+"color-convert": "^2.0.1"
+},
+"engines": {
+"node": ">=8"
+},
+"funding": {
+"url": "https://github.com/chalk/ansi-styles?sponsor=1"
+}
+},
+"node_modules/jest-cli/node_modules/chalk": {
+"version": "4.1.2",
+"resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
+"integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
+"dev": true,
+"license": "MIT",
+"dependencies": {
+"ansi-styles": "^4.1.0",
+"supports-color": "^7.1.0"
+},
+"engines": {
+"node": ">=10"
+},
+"funding": {
+"url": "https://github.com/chalk/chalk?sponsor=1"
+}
+},
+"node_modules/jest-cli/node_modules/color-convert": {
+"version": "2.0.1",
+"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
+"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
+"dev": true,
+"license": "MIT",
+"dependencies": {
+"color-name": "~1.1.4"
+},
+"engines": {
+"node": ">=7.0.0"
+}
+},
+"node_modules/jest-cli/node_modules/color-name": {
+"version": "1.1.4",
+"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
+"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
+"dev": true,
+"license": "MIT"
+},
+"node_modules/jest-cli/node_modules/has-flag": {
+"version": "4.0.0",
+"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
+"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
+"dev": true,
+"license": "MIT",
+"engines": {
+"node": ">=8"
+}
+},
+"node_modules/jest-cli/node_modules/supports-color": {
+"version": "7.2.0",
+"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
+"integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
+"dev": true,
+"license": "MIT",
+"dependencies": {
+"has-flag": "^4.0.0"
+},
+"engines": {
+"node": ">=8"
+}
+},
 "node_modules/jest-config": {
 "version": "28.1.3",
 "resolved": "https://registry.npmjs.org/jest-config/-/jest-config-28.1.3.tgz",
@@ -5283,110 +5306,6 @@
 "url": "https://github.com/chalk/supports-color?sponsor=1"
 }
 },
-"node_modules/jest/node_modules/ansi-styles": {
-"version": "4.3.0",
-"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
-"integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
-"dev": true,
-"dependencies": {
-"color-convert": "^2.0.1"
-},
-"engines": {
-"node": ">=8"
-},
-"funding": {
-"url": "https://github.com/chalk/ansi-styles?sponsor=1"
-}
-},
-"node_modules/jest/node_modules/chalk": {
-"version": "4.1.2",
-"resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
-"integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
-"dev": true,
-"dependencies": {
-"ansi-styles": "^4.1.0",
-"supports-color": "^7.1.0"
-},
-"engines": {
-"node": ">=10"
-},
-"funding": {
-"url": "https://github.com/chalk/chalk?sponsor=1"
-}
-},
-"node_modules/jest/node_modules/color-convert": {
-"version": "2.0.1",
-"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
-"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
-"dev": true,
-"dependencies": {
-"color-name": "~1.1.4"
-},
-"engines": {
-"node": ">=7.0.0"
-}
-},
-"node_modules/jest/node_modules/color-name": {
-"version": "1.1.4",
-"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
-"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
-"dev": true
-},
-"node_modules/jest/node_modules/has-flag": {
-"version": "4.0.0",
-"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
-"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
-"dev": true,
-"engines": {
-"node": ">=8"
-}
-},
-"node_modules/jest/node_modules/jest-cli": {
-"version": "28.1.3",
-"resolved": "https://registry.npmjs.org/jest-cli/-/jest-cli-28.1.3.tgz",
-"integrity": "sha512-roY3kvrv57Azn1yPgdTebPAXvdR2xfezaKKYzVxZ6It/5NCxzJym6tUI5P1zkdWhfUYkxEI9uZWcQdaFLo8mJQ==",
-"dev": true,
-"dependencies": {
-"@jest/core": "^28.1.3",
-"@jest/test-result": "^28.1.3",
-"@jest/types": "^28.1.3",
-"chalk": "^4.0.0",
-"exit": "^0.1.2",
-"graceful-fs": "^4.2.9",
-"import-local": "^3.0.2",
-"jest-config": "^28.1.3",
-"jest-util": "^28.1.3",
-"jest-validate": "^28.1.3",
-"prompts": "^2.0.1",
-"yargs": "^17.3.1"
-},
-"bin": {
-"jest": "bin/jest.js"
-},
-"engines": {
-"node": "^12.13.0 || ^14.15.0 || ^16.10.0 || >=17.0.0"
-},
-"peerDependencies": {
-"node-notifier": "^8.0.1 || ^9.0.0 || ^10.0.0"
-},
-"peerDependenciesMeta": {
-"node-notifier": {
-"optional": true
-}
-}
-},
-"node_modules/jest/node_modules/supports-color": {
-"version": "7.2.0",
-"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
-"integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
-"dev": true,
-"dependencies": {
-"has-flag": "^4.0.0"
-},
-"engines": {
-"node": ">=8"
-}
-},
 "node_modules/js-tokens": {
 "version": "4.0.0",
 "resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
@@ -5394,10 +5313,11 @@
 "dev": true
 },
 "node_modules/js-yaml": {
-"version": "3.14.1",
-"resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-3.14.1.tgz",
-"integrity": "sha512-okMH7OXXJ7YrN9Ok3/SXrnu4iX9yOk+25nqX4imS2npuvTYDmo/QEZoqwZkYaIDk3jVvBOTOIEgEhaLOynBS9g==",
+"version": "3.14.2",
+"resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-3.14.2.tgz",
+"integrity": "sha512-PMSmkqxr106Xa156c2M265Z+FTrPl+oxd/rgOQy2tijQeK5TxQ43psO1ZCwhVOSdnn+RzkzlRz/eY4BgJBYVpg==",
 "dev": true,
+"license": "MIT",
 "dependencies": {
 "argparse": "^1.0.7",
 "esprima": "^4.0.0"
@@ -5914,10 +5834,11 @@
 }
 },
 "node_modules/picocolors": {
-"version": "1.0.0",
-"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.0.tgz",
-"integrity": "sha512-1fygroTLlHu66zi26VoTDv8yRgm0Fccecssto+MhsZ0D/DGW2sm8E8AjW7NU5VVTRt5GxbeZ5qBuJr+HyLYkjQ==",
-"dev": true
+"version": "1.1.1",
+"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz",
+"integrity": "sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==",
+"dev": true,
+"license": "ISC"
 },
 "node_modules/picomatch": {
 "version": "2.3.1",
@@ -6068,12 +5989,6 @@
 "node": ">=6"
 }
 },
-"node_modules/regenerator-runtime": {
-"version": "0.13.10",
-"resolved": "https://registry.npmjs.org/regenerator-runtime/-/regenerator-runtime-0.13.10.tgz",
-"integrity": "sha512-KepLsg4dU12hryUO7bp/axHAKvwGOCV0sGloQtpagJ12ai+ojVDqkeGSiRX1zlq+kjIMZ1t7gpze+26QqtdGqw==",
-"dev": true
-},
 "node_modules/regexp.prototype.flags": {
 "version": "1.4.3",
 "resolved": "https://registry.npmjs.org/regexp.prototype.flags/-/regexp.prototype.flags-1.4.3.tgz",
@@ -6537,15 +6452,6 @@
 "integrity": "sha512-3f0uOEAQwIqGuWW2MVzYg8fV/QNnc/IpuJNG837rLuczAaLVHslWHZQj4IGiEl5Hs3kkbhwL9Ab7Hrsmuj+Smw==",
 "dev": true
 },
-"node_modules/to-fast-properties": {
-"version": "2.0.0",
-"resolved": "https://registry.npmjs.org/to-fast-properties/-/to-fast-properties-2.0.0.tgz",
-"integrity": "sha512-/OaKK0xYrs3DmxRYqL/yDc+FxFUVYhDlXMhRmv3z915w2HF1tnN1omB354j8VUGO/hbRzyD6Y3sA7v7GS/ceog==",
-"dev": true,
-"engines": {
-"node": ">=4"
-}
-},
 "node_modules/to-regex-range": {
 "version": "5.0.1",
 "resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
@@ -6722,9 +6628,9 @@
 }
 },
 "node_modules/typedoc/node_modules/brace-expansion": {
-"version": "2.0.1",
-"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.1.tgz",
-"integrity": "sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==",
+"version": "2.0.2",
+"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.2.tgz",
+"integrity": "sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ==",
 "dev": true,
 "license": "MIT",
 "dependencies": {


@@ -1307,10 +1307,6 @@ export function createApi(Z3: Z3Core): Z3HighLevel {
     return new SetImpl<ElemSort>(check(Z3.mk_set_difference(contextPtr, a.ast, b.ast)));
 }
-function SetHasSize<ElemSort extends AnySort<Name>>(set: SMTSet<Name, ElemSort>, size: bigint | number | string | IntNum<Name>): Bool<Name> {
-    const a = typeof size === 'object'? Int.sort().cast(size) : Int.sort().cast(size);
-    return new BoolImpl(check(Z3.mk_set_has_size(contextPtr, set.ast, a.ast)));
-}
 function SetAdd<ElemSort extends AnySort<Name>>(set: SMTSet<Name, ElemSort>, elem: CoercibleToMap<SortToExprMap<ElemSort, Name>, Name>): SMTSet<Name, ElemSort> {
     const arg = set.elemSort().cast(elem as any);
@@ -2644,9 +2640,6 @@ export function createApi(Z3: Z3Core): Z3HighLevel {
 diff(b: SMTSet<Name, ElemSort>): SMTSet<Name, ElemSort> {
     return SetDifference(this, b);
 }
-hasSize(size: string | number | bigint | IntNum<Name>): Bool<Name> {
-    return SetHasSize(this, size);
-}
 add(elem: CoercibleToMap<SortToExprMap<ElemSort, Name>, Name>): SMTSet<Name, ElemSort> {
     return SetAdd(this, elem);
 }
@@ -3292,7 +3285,6 @@ export function createApi(Z3: Z3Core): Z3HighLevel {
 SetUnion,
 SetIntersect,
 SetDifference,
-SetHasSize,
 SetAdd,
 SetDel,
 SetComplement,
@@ -3317,6 +3309,6 @@ export function createApi(Z3: Z3Core): Z3HighLevel {
 setParam,
 resetParams,
-Context: createContext,
+Context: createContext as ContextCtor,
 };
}


@@ -125,6 +125,7 @@ export type CheckSatResult = 'sat' | 'unsat' | 'unknown';
 /** @hidden */
 export interface ContextCtor {
 <Name extends string>(name: Name, options?: Record<string, any>): Context<Name>;
+new <Name extends string>(name: Name, options?: Record<string, any>): Context<Name>;
 }
 export interface Context<Name extends string = 'main'> {
@@ -629,9 +630,6 @@ export interface Context<Name extends string = 'main'> {
 /** @category Operations */
 SetDifference<ElemSort extends AnySort<Name>>(a: SMTSet<Name, ElemSort>, b: SMTSet<Name, ElemSort>): SMTSet<Name, ElemSort>;
-/** @category Operations */
-SetHasSize<ElemSort extends AnySort<Name>>(set: SMTSet<Name, ElemSort>, size: bigint | number | string | IntNum<Name>): Bool<Name>;
 /** @category Operations */
 SetAdd<ElemSort extends AnySort<Name>>(set: SMTSet<Name, ElemSort>, elem: CoercibleToMap<SortToExprMap<ElemSort, Name>, Name>): SMTSet<Name, ElemSort>;
@@ -1649,7 +1647,6 @@ export interface SMTSet<Name extends string = 'main', ElemSort extends AnySort<N
 intersect(...args: SMTSet<Name, ElemSort>[]): SMTSet<Name, ElemSort>;
 diff(b: SMTSet<Name, ElemSort>): SMTSet<Name, ElemSort>;
-hasSize(size: bigint | number | string | IntNum<Name>): Bool<Name>;
 add(elem: CoercibleToMap<SortToExprMap<ElemSort, Name>, Name>): SMTSet<Name, ElemSort>;
 del(elem: CoercibleToMap<SortToExprMap<ElemSort, Name>, Name>): SMTSet<Name, ElemSort>;


@@ -11,7 +11,7 @@ export * from './low-level/types.__GENERATED__';
 * The main entry point to the Z3 API
 *
 * ```typescript
-* import { init, sat } from 'z3-solver';
+* import { init } from 'z3-solver';
 *
 * const { Context } = await init();
 * const { Solver, Int } = new Context('main');
@@ -22,7 +22,7 @@ export * from './low-level/types.__GENERATED__';
 * const solver = new Solver();
 * solver.add(x.add(2).le(y.sub(10))); // x + 2 <= y - 10
 *
-* if (await solver.check() !== sat) {
+* if (await solver.check() !== 'sat') {
 * throw new Error("couldn't find a solution")
 * }
 * const model = solver.model();


@@ -15,7 +15,7 @@ type context = Z3native.context
 module Log =
 struct
 let open_ filename =
-lbool_of_int (Z3native.open_log filename) = L_TRUE
+(Z3native.open_log filename)
 let close = Z3native.close_log
 let append = Z3native.append_log
 end


@@ -70,13 +70,32 @@ else()
 endif()
 # Link libz3 into the python directory so bindings work out of the box
-add_custom_command(OUTPUT "${z3py_bindings_build_dest}/libz3${CMAKE_SHARED_MODULE_SUFFIX}"
-COMMAND "${CMAKE_COMMAND}" "-E" "${LINK_COMMAND}"
-"${PROJECT_BINARY_DIR}/libz3${CMAKE_SHARED_MODULE_SUFFIX}"
-"${z3py_bindings_build_dest}/libz3${CMAKE_SHARED_MODULE_SUFFIX}"
-DEPENDS libz3
-COMMENT "Linking libz3 into python directory"
-)
+# Handle both built libz3 and pre-installed libz3
+if (TARGET libz3)
+# Get the libz3 location - handle both regular and imported targets
+get_target_property(LIBZ3_IS_IMPORTED libz3 IMPORTED)
+if (LIBZ3_IS_IMPORTED)
+# For imported targets, get the IMPORTED_LOCATION
+get_target_property(LIBZ3_SOURCE_PATH libz3 IMPORTED_LOCATION)
+# No dependency on libz3 target since it's pre-built
+set(LIBZ3_DEPENDS "")
+else()
+# For regular targets, use the build output location
+set(LIBZ3_SOURCE_PATH "${PROJECT_BINARY_DIR}/libz3${CMAKE_SHARED_MODULE_SUFFIX}")
+set(LIBZ3_DEPENDS libz3)
+endif()
+add_custom_command(OUTPUT "${z3py_bindings_build_dest}/libz3${CMAKE_SHARED_MODULE_SUFFIX}"
+COMMAND "${CMAKE_COMMAND}" "-E" "${LINK_COMMAND}"
+"${LIBZ3_SOURCE_PATH}"
+"${z3py_bindings_build_dest}/libz3${CMAKE_SHARED_MODULE_SUFFIX}"
+DEPENDS ${LIBZ3_DEPENDS}
+COMMENT "Linking libz3 into python directory"
+)
+else()
+message(FATAL_ERROR "libz3 target not found. Cannot build Python bindings.")
+endif()
 # Convenient top-level target
 add_custom_target(build_z3_python_bindings


@@ -24,6 +24,7 @@ ROOT_DIR = os.path.abspath(os.path.dirname(__file__))
 SRC_DIR_LOCAL = os.path.join(ROOT_DIR, 'core')
 SRC_DIR_REPO = os.path.join(ROOT_DIR, '..', '..', '..')
 SRC_DIR = SRC_DIR_LOCAL if os.path.exists(SRC_DIR_LOCAL) else SRC_DIR_REPO
+BUILD_DIR = build_env.get('Z3BUILD', 'build')
 IS_SINGLE_THREADED = False
 ENABLE_LTO = True
@@ -34,7 +35,7 @@ IS_PYODIDE = 'PYODIDE_ROOT' in os.environ and os.environ.get('_PYTHON_HOST_PLATF
 # determine where binaries are
 RELEASE_DIR = os.environ.get('PACKAGE_FROM_RELEASE', None)
 if RELEASE_DIR is None:
-BUILD_DIR = os.path.join(SRC_DIR, 'build') # implicit in configure script
+BUILD_DIR = os.path.join(SRC_DIR, BUILD_DIR) # implicit in configure script
 HEADER_DIRS = [os.path.join(SRC_DIR, 'src', 'api'), os.path.join(SRC_DIR, 'src', 'api', 'c++')]
 RELEASE_METADATA = None
 if IS_PYODIDE:
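The setup.py hunk above makes the build directory overridable via a `Z3BUILD` environment variable, falling back to the old hard-coded `'build'` subdirectory. A stdlib-only sketch of the lookup pattern (`resolve_build_dir` is a hypothetical helper; the real script reads `build_env` at module level):

```python
import os.path

def resolve_build_dir(build_env, src_dir):
    # Z3BUILD names the build subdirectory; fall back to the old 'build' default
    build_dir = build_env.get('Z3BUILD', 'build')
    return os.path.join(src_dir, build_dir)

print(resolve_build_dir({}, '/src/z3'))
print(resolve_build_dir({'Z3BUILD': 'out/release'}, '/src/z3'))
```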


@@ -5010,13 +5010,6 @@ def Ext(a, b):
 _z3_assert(is_array_sort(a) and (is_array(b) or b.is_lambda()), "arguments must be arrays")
 return _to_expr_ref(Z3_mk_array_ext(ctx.ref(), a.as_ast(), b.as_ast()), ctx)
-def SetHasSize(a, k):
-ctx = a.ctx
-k = _py2expr(k, ctx)
-return _to_expr_ref(Z3_mk_set_has_size(ctx.ref(), a.as_ast(), k.as_ast()), ctx)
 def is_select(a):
 """Return `True` if `a` is a Z3 array select application.
@@ -10039,7 +10032,7 @@ class FPNumRef(FPRef):
 """
 def sign(self):
-num = (ctypes.c_int)()
+num = ctypes.c_bool()
 nsign = Z3_fpa_get_numeral_sign(self.ctx.ref(), self.as_ast(), byref(num))
 if nsign is False:
 raise Z3Exception("error retrieving the sign of a numeral.")
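With `Z3_fpa_get_numeral_sign` now taking `bool *` instead of `int *`, the binding allocates `ctypes.c_bool()` for the out-parameter. A stdlib-only sketch of that out-parameter pattern (`fake_get_sign` stands in for the C call; no z3 library needed):

```python
import ctypes

def fake_get_sign(sgn_ptr):
    # stand-in for the C function writing through its bool* out-parameter
    sgn_ptr[0] = True   # True would mean the numeral is negative
    return True         # the C API reports success/failure via the return value

sgn = ctypes.c_bool()   # matches the new `bool * sgn` signature (was c_int)
ok = fake_get_sign(ctypes.pointer(sgn))
assert ok and sgn.value is True
```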


@@ -831,7 +831,7 @@ class Formatter:
 else:
 _z3_assert(z3.is_fp_value(a), "expecting FP num ast")
 r = []
-sgn = c_int(0)
+sgn = ctypes.c_bool()
 sgnb = Z3_fpa_get_numeral_sign(a.ctx_ref(), a.ast, byref(sgn))
 exp = Z3_fpa_get_numeral_exponent_string(a.ctx_ref(), a.ast, False)
 sig = Z3_fpa_get_numeral_significand_string(a.ctx_ref(), a.ast)
@@ -861,7 +861,7 @@ class Formatter:
 else:
 _z3_assert(z3.is_fp_value(a), "expecting FP num ast")
 r = []
-sgn = (ctypes.c_int)(0)
+sgn = ctypes.c_bool()
 sgnb = Z3_fpa_get_numeral_sign(a.ctx_ref(), a.ast, byref(sgn))
 exp = Z3_fpa_get_numeral_exponent_string(a.ctx_ref(), a.ast, False)
 sig = Z3_fpa_get_numeral_significand_string(a.ctx_ref(), a.ast)


@@ -1037,8 +1037,6 @@ typedef enum {
 Z3_OP_SET_SUBSET,
 Z3_OP_AS_ARRAY,
 Z3_OP_ARRAY_EXT,
-Z3_OP_SET_HAS_SIZE,
-Z3_OP_SET_CARD,
 // Bit-vectors
 Z3_OP_BNUM = 0x400,
@@ -3316,12 +3314,6 @@ extern "C" {
 */
 Z3_ast Z3_API Z3_mk_as_array(Z3_context c, Z3_func_decl f);
-/**
-\brief Create predicate that holds if Boolean array \c set has \c k elements set to true.
-def_API('Z3_mk_set_has_size', AST, (_in(CONTEXT), _in(AST), _in(AST)))
-*/
-Z3_ast Z3_API Z3_mk_set_has_size(Z3_context c, Z3_ast set, Z3_ast k);
 /**@}*/
@@ -5877,7 +5869,7 @@ extern "C" {
 \sa Z3_append_log
 \sa Z3_close_log
-extra_API('Z3_open_log', INT, (_in(STRING),))
+extra_API('Z3_open_log', BOOL, (_in(STRING),))
 */
 bool Z3_API Z3_open_log(Z3_string filename);
@@ -7140,7 +7132,7 @@ extern "C" {
 \brief retrieve the decision depth of Boolean literals (variables or their negations).
 Assumes a check-sat call and no other calls (to extract models) have been invoked.
-def_API('Z3_solver_get_levels', VOID, (_in(CONTEXT), _in(SOLVER), _in(AST_VECTOR), _in(UINT), _in_array(3, UINT)))
+def_API('Z3_solver_get_levels', VOID, (_in(CONTEXT), _in(SOLVER), _in(AST_VECTOR), _in(UINT), _out_array(3, UINT)))
 */
 void Z3_API Z3_solver_get_levels(Z3_context c, Z3_solver s, Z3_ast_vector literals, unsigned sz, unsigned levels[]);


@@ -1089,6 +1089,22 @@ extern "C" {
 */
 unsigned Z3_API Z3_fpa_get_sbits(Z3_context c, Z3_sort s);
+/**
+\brief Checks whether a given ast is a floating-point numeral.
+\param c logical context
+\param t an ast
+\sa Z3_fpa_is_numeral_nan
+\sa Z3_fpa_is_numeral_inf
+\sa Z3_fpa_is_numeral_normal
+\sa Z3_fpa_is_numeral_subnormal
+\sa Z3_fpa_is_numeral_zero
+def_API('Z3_fpa_is_numeral', BOOL, (_in(CONTEXT), _in(AST)))
+*/
+bool Z3_API Z3_fpa_is_numeral(Z3_context c, Z3_ast t);
 /**
 \brief Checks whether a given floating-point numeral is a NaN.
@@ -1220,12 +1236,12 @@ extern "C" {
 \param sgn the retrieved sign
 \returns true if \c t corresponds to a floating point numeral, otherwise invokes exception handler or returns false
-Remarks: sets \c sgn to 0 if `t' is positive and to 1 otherwise, except for
+Remarks: sets \c sgn to \c false if `t' is positive and to \c true otherwise, except for
 NaN, which is an invalid argument.
-def_API('Z3_fpa_get_numeral_sign', BOOL, (_in(CONTEXT), _in(AST), _out(INT)))
+def_API('Z3_fpa_get_numeral_sign', BOOL, (_in(CONTEXT), _in(AST), _out(BOOL)))
 */
-bool Z3_API Z3_fpa_get_numeral_sign(Z3_context c, Z3_ast t, int * sgn);
+bool Z3_API Z3_fpa_get_numeral_sign(Z3_context c, Z3_ast t, bool * sgn);
 /**
 \brief Return the significand value of a floating-point numeral as a string.
View file

@@ -379,6 +379,23 @@ extern "C" {
 void* ctx,
 Z3_model_eh model_eh);
+/**
+\brief Copy an optimization context from a source to a target context.
+This function allows translating an optimization context from one Z3_context
+to another. This is useful when working with multiple contexts and needing to
+transfer optimization problems between them.
+\param c Source context containing the optimization context to translate
+\param o The optimization context to translate from the source context
+\param target Target context where the optimization context will be created
+\return A new optimization context in the target context with the same state
+def_API('Z3_optimize_translate', OPTIMIZE, (_in(CONTEXT), _in(OPTIMIZE), _in(CONTEXT)))
+*/
+Z3_optimize Z3_API Z3_optimize_translate(Z3_context c, Z3_optimize o, Z3_context target);
 /**@}*/
 /**@}*/


@@ -272,9 +272,9 @@ extern "C" {
 \pre Z3_rcf_is_algebraic(ctx, a);
-def_API('Z3_rcf_interval', INT, (_in(CONTEXT), _in(RCF_NUM), _out(INT), _out(INT), _out(RCF_NUM), _out(INT), _out(INT), _out(RCF_NUM)))
+def_API('Z3_rcf_interval', INT, (_in(CONTEXT), _in(RCF_NUM), _out(BOOL), _out(BOOL), _out(RCF_NUM), _out(BOOL), _out(BOOL), _out(RCF_NUM)))
 */
-int Z3_API Z3_rcf_interval(Z3_context c, Z3_rcf_num a, int * lower_is_inf, int * lower_is_open, Z3_rcf_num * lower, int * upper_is_inf, int * upper_is_open, Z3_rcf_num * upper);
+int Z3_API Z3_rcf_interval(Z3_context c, Z3_rcf_num a, bool * lower_is_inf, bool * lower_is_open, Z3_rcf_num * lower, bool * upper_is_inf, bool * upper_is_open, Z3_rcf_num * upper);
 /**
 \brief Return the number of sign conditions of an algebraic number.
View file

@@ -662,6 +662,11 @@ struct z3_replayer::imp {
 return v.data();
 }
+bool * get_bool_addr(unsigned pos) {
+check_arg(pos, INT64);
+return reinterpret_cast<bool*>(&(m_args[pos].m_int));
+}
 int * get_int_addr(unsigned pos) {
 check_arg(pos, INT64);
 return reinterpret_cast<int*>(&(m_args[pos].m_int));
@@ -790,6 +795,10 @@ void ** z3_replayer::get_obj_array(unsigned pos) const {
 return m_imp->get_obj_array(pos);
 }
+bool * z3_replayer::get_bool_addr(unsigned pos) {
+return m_imp->get_bool_addr(pos);
+}
 int * z3_replayer::get_int_addr(unsigned pos) {
 return m_imp->get_int_addr(pos);
 }


@@ -53,6 +53,7 @@ public:
 Z3_symbol * get_symbol_array(unsigned pos) const;
 void ** get_obj_array(unsigned pos) const;
+bool * get_bool_addr(unsigned pos);
 int * get_int_addr(unsigned pos);
 int64_t * get_int64_addr(unsigned pos);
 unsigned * get_uint_addr(unsigned pos);
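The new `get_bool_addr` reuses the replayer's INT64 argument slot and reinterprets its storage as `bool *`. The aliasing can be sketched with ctypes (assuming, as on mainstream platforms, that `bool` occupies one byte of the slot; the nonzero check below is endianness-neutral):

```python
import ctypes

slot = ctypes.c_int64(0)  # one replayer argument slot (m_args[pos].m_int)
# reinterpret the slot's storage as bool*, like reinterpret_cast<bool*>(&m_int)
as_bool = ctypes.cast(ctypes.pointer(slot), ctypes.POINTER(ctypes.c_bool))
as_bool[0] = True
assert slot.value != 0    # the write is visible through the original int64 view
```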


@@ -188,8 +188,12 @@ void arith_decl_plugin::set_manager(ast_manager * m, family_id id) {
 m_to_real_decl = m->mk_func_decl(symbol("to_real"), i, r, func_decl_info(id, OP_TO_REAL));
 m->inc_ref(m_to_real_decl);
+m_r_to_real_decl = m->mk_func_decl(symbol("to_real"), r, r, func_decl_info(id, OP_TO_REAL));
+m->inc_ref(m_r_to_real_decl);
 m_to_int_decl = m->mk_func_decl(symbol("to_int"), r, i, func_decl_info(id, OP_TO_INT));
 m->inc_ref(m_to_int_decl);
+m_i_to_int_decl = m->mk_func_decl(symbol("to_int"), i, i, func_decl_info(id, OP_TO_INT));
+m->inc_ref(m_i_to_int_decl);
 m_is_int_decl = m->mk_func_decl(symbol("is_int"), r, m->mk_bool_sort(), func_decl_info(id, OP_IS_INT));
 m->inc_ref(m_is_int_decl);
@@ -311,6 +315,8 @@ void arith_decl_plugin::finalize() {
 DEC_REF(m_i_rem_decl);
 DEC_REF(m_to_real_decl);
 DEC_REF(m_to_int_decl);
+DEC_REF(m_r_to_real_decl);
+DEC_REF(m_i_to_int_decl);
 DEC_REF(m_is_int_decl);
 DEC_REF(m_i_power_decl);
 DEC_REF(m_r_power_decl);
@@ -368,8 +374,8 @@ inline func_decl * arith_decl_plugin::mk_func_decl(decl_kind k, bool is_real) {
 return m_manager->mk_func_decl(symbol("^0"), m_real_decl, m_real_decl, m_real_decl, func_decl_info(m_family_id, OP_POWER0));
 }
 return m_manager->mk_func_decl(symbol("^0"), m_int_decl, m_int_decl, m_real_decl, func_decl_info(m_family_id, OP_POWER0));
-case OP_TO_REAL: return m_to_real_decl;
-case OP_TO_INT: return m_to_int_decl;
+case OP_TO_REAL: return is_real ? m_r_to_real_decl : m_to_real_decl;
+case OP_TO_INT: return is_real ? m_to_int_decl : m_i_to_int_decl;
 case OP_IS_INT: return m_is_int_decl;
 case OP_POWER: return is_real ? m_r_power_decl : m_i_power_decl;
 case OP_ABS: return is_real ? m_r_abs_decl : m_i_abs_decl;


@@ -120,11 +120,13 @@ protected:
 func_decl * m_i_mod_decl;
 func_decl * m_i_rem_decl;
-func_decl * m_to_real_decl;
-func_decl * m_to_int_decl;
-func_decl * m_is_int_decl;
-func_decl * m_r_power_decl;
-func_decl * m_i_power_decl;
+func_decl * m_to_real_decl = nullptr;
+func_decl * m_to_int_decl = nullptr;
+func_decl * m_r_to_real_decl = nullptr;
+func_decl * m_i_to_int_decl = nullptr;
+func_decl * m_is_int_decl = nullptr;
+func_decl * m_r_power_decl = nullptr;
+func_decl * m_i_power_decl = nullptr;
 func_decl * m_r_abs_decl;
 func_decl * m_i_abs_decl;


@@ -35,9 +35,7 @@ array_decl_plugin::array_decl_plugin():
 m_set_complement_sym("complement"),
 m_set_subset_sym("subset"),
 m_array_ext_sym("array-ext"),
-m_as_array_sym("as-array"),
-m_set_has_size_sym("set-has-size"),
-m_set_card_sym("card") {
+m_as_array_sym("as-array") {
 }
 #define ARRAY_SORT_STR "Array"
@@ -442,40 +440,6 @@ func_decl * array_decl_plugin::mk_set_subset(unsigned arity, sort * const * doma
 func_decl_info(m_family_id, OP_SET_SUBSET));
 }
-func_decl * array_decl_plugin::mk_set_card(unsigned arity, sort * const* domain) {
-if (arity != 1) {
-m_manager->raise_exception("card takes only one argument");
-return nullptr;
-}
-arith_util arith(*m_manager);
-if (!is_array_sort(domain[0]) || !m_manager->is_bool(get_array_range(domain[0]))) {
-m_manager->raise_exception("card expects an array of Booleans");
-}
-sort * int_sort = arith.mk_int();
-return m_manager->mk_func_decl(m_set_card_sym, arity, domain, int_sort,
-func_decl_info(m_family_id, OP_SET_CARD));
-}
-func_decl * array_decl_plugin::mk_set_has_size(unsigned arity, sort * const* domain) {
-if (arity != 2) {
-m_manager->raise_exception("set-has-size takes two arguments");
-return nullptr;
-}
-m_manager->raise_exception("set-has-size is not supported");
-// domain[0] is a Boolean array,
-// domain[1] is Int
-arith_util arith(*m_manager);
-if (!arith.is_int(domain[1])) {
-m_manager->raise_exception("set-has-size expects second argument to be an integer");
-}
-if (!is_array_sort(domain[0]) || !m_manager->is_bool(get_array_range(domain[0]))) {
-m_manager->raise_exception("set-has-size expects first argument to be an array of Booleans");
-}
-sort * bool_sort = m_manager->mk_bool_sort();
-return m_manager->mk_func_decl(m_set_has_size_sym, arity, domain, bool_sort,
-func_decl_info(m_family_id, OP_SET_HAS_SIZE));
-}
 func_decl * array_decl_plugin::mk_as_array(func_decl * f) {
 vector<parameter> parameters;
@@ -541,10 +505,6 @@ func_decl * array_decl_plugin::mk_func_decl(decl_kind k, unsigned num_parameters
 return mk_set_complement(arity, domain);
 case OP_SET_SUBSET:
 return mk_set_subset(arity, domain);
-case OP_SET_HAS_SIZE:
-return mk_set_has_size(arity, domain);
-case OP_SET_CARD:
-return mk_set_card(arity, domain);
 case OP_AS_ARRAY: {
 if (num_parameters != 1 ||
 !parameters[0].is_ast() ||


@@ -62,8 +62,6 @@ enum array_op_kind {
 OP_SET_DIFFERENCE,
 OP_SET_COMPLEMENT,
 OP_SET_SUBSET,
-OP_SET_HAS_SIZE,
-OP_SET_CARD,
 OP_AS_ARRAY, // used for model construction
 LAST_ARRAY_OP
 };
@@ -81,8 +79,6 @@ class array_decl_plugin : public decl_plugin {
 symbol m_set_subset_sym;
 symbol m_array_ext_sym;
 symbol m_as_array_sym;
-symbol m_set_has_size_sym;
-symbol m_set_card_sym;
 bool check_set_arguments(unsigned arity, sort * const * domain);
@@ -110,10 +106,6 @@ class array_decl_plugin : public decl_plugin {
 func_decl * mk_as_array(func_decl * f);
-func_decl* mk_set_has_size(unsigned arity, sort * const* domain);
-func_decl* mk_set_card(unsigned arity, sort * const* domain);
 bool is_array_sort(sort* s) const;
 public:
 array_decl_plugin();
@@ -173,8 +165,6 @@ public:
 bool is_complement(expr* n) const { return is_app_of(n, m_fid, OP_SET_COMPLEMENT); }
 bool is_as_array(expr * n) const { return is_app_of(n, m_fid, OP_AS_ARRAY); }
 bool is_as_array(expr * n, func_decl*& f) const { return is_as_array(n) && (f = get_as_array_func_decl(n), true); }
-bool is_set_has_size(expr* e) const { return is_app_of(e, m_fid, OP_SET_HAS_SIZE); }
-bool is_set_card(expr* e) const { return is_app_of(e, m_fid, OP_SET_CARD); }
 bool is_select(func_decl* f) const { return is_decl_of(f, m_fid, OP_SELECT); }
 bool is_store(func_decl* f) const { return is_decl_of(f, m_fid, OP_STORE); }
 bool is_const(func_decl* f) const { return is_decl_of(f, m_fid, OP_CONST_ARRAY); }
@@ -182,8 +172,6 @@ public:
 bool is_union(func_decl* f) const { return is_decl_of(f, m_fid, OP_SET_UNION); }
 bool is_intersect(func_decl* f) const { return is_decl_of(f, m_fid, OP_SET_INTERSECT); }
 bool is_as_array(func_decl* f) const { return is_decl_of(f, m_fid, OP_AS_ARRAY); }
-bool is_set_has_size(func_decl* f) const { return is_decl_of(f, m_fid, OP_SET_HAS_SIZE); }
-bool is_set_card(func_decl* f) const { return is_decl_of(f, m_fid, OP_SET_CARD); }
 bool is_default(func_decl* f) const { return is_decl_of(f, m_fid, OP_ARRAY_DEFAULT); }
 bool is_default(expr* n) const { return is_app_of(n, m_fid, OP_ARRAY_DEFAULT); }
 bool is_subset(expr const* n) const { return is_app_of(n, m_fid, OP_SET_SUBSET); }
@@ -307,14 +295,6 @@ public:
 return m_manager.mk_app(m_fid, OP_SET_UNION, s1, s2);
 }
-app* mk_has_size(expr* set, expr* n) {
-return m_manager.mk_app(m_fid, OP_SET_HAS_SIZE, set, n);
-}
-app* mk_card(expr* set) {
-return m_manager.mk_app(m_fid, OP_SET_CARD, set);
-}
 func_decl * mk_array_ext(sort* domain, unsigned i);
 sort * mk_array_sort(sort* dom, sort* range) { return mk_array_sort(1, &dom, range); }


@@ -1057,7 +1057,6 @@ namespace euf {
 SASSERT(is_correct_ref_count(dst, dst_counts));
 SASSERT(&src_r.m_nodes != &dst);
 unsigned sz = dst.size(), j = 0;
-bool change = false;
 for (unsigned i = 0; i < sz; ++i) {
 auto* n = dst[i];
 unsigned id = n->id();


@@ -59,7 +59,6 @@ namespace euf {
 expr* e = n->get_expr(), * x, * y;
 // x - y = x + (* -1 y)
 if (a.is_sub(e, x, y)) {
-auto& m = g.get_manager();
 auto e1 = a.mk_numeral(rational(-1), a.is_int(x));
 auto n1 = g.find(e1) ? g.find(e1) : g.mk(e1, 0, 0, nullptr);
 auto e2 = a.mk_mul(e1, y);


@@ -293,8 +293,7 @@ namespace euf {
 // v - offset |-> t
 if (is_meta_var(p, wi.pat_offset()) && is_closed(t, 0, wi.term_offset())) {
 auto v = to_var(p);
-auto idx = v->get_idx() - wi.pat_offset();
-SASSERT(!m_subst.get(idx)); // reduce ensures meta variables are not in substitutions
+SASSERT(!m_subst.get(v->get_idx() - wi.pat_offset())); // reduce ensures meta variables are not in substitutions
 add_binding(v, wi.pat_offset(), t);
 wi.set_done();
 return true;


@@ -246,39 +246,39 @@ void fpa2bv_converter::mk_var(unsigned base_inx, sort * srt, expr_ref & result)
 result = m_util.mk_fp(sgn, e, s);
 }
-expr_ref fpa2bv_converter::extra_quantify(expr * e)
-{
+expr_ref fpa2bv_converter::extra_quantify(expr * e) {
 used_vars uv;
-unsigned nv;
-ptr_buffer<sort> new_decl_sorts;
-sbuffer<symbol> new_decl_names;
-expr_ref_buffer subst_map(m);
 uv(e);
-nv = uv.get_num_vars();
-subst_map.resize(uv.get_max_found_var_idx_plus_1());
-if (nv == 0)
+if (uv.get_num_vars() == 0)
 return expr_ref(e, m);
-for (unsigned i = 0; i < nv; i++)
-{
+ptr_vector<sort> new_decl_sorts;
+svector<symbol> new_decl_names;
+expr_ref_vector subst_map(m);
+unsigned nv = uv.get_max_found_var_idx_plus_1();
+subst_map.resize(nv);
+unsigned j = 0;
+for (unsigned i = 0; i < nv; i++) {
 if (uv.contains(i)) {
 TRACE(fpa2bv, tout << "uv[" << i << "] = " << mk_ismt2_pp(uv.get(i), m) << std::endl; );
 sort * s = uv.get(i);
-var * v = m.mk_var(i, s);
+var * v = m.mk_var(j, s);
 new_decl_sorts.push_back(s);
-new_decl_names.push_back(symbol(i));
+new_decl_names.push_back(symbol(j));
 subst_map.set(i, v);
+++j;
 }
 }
-SASSERT(!new_decl_sorts.empty());
-expr_ref res(m);
-var_subst vsubst(m);
-res = vsubst.operator()(e, nv, subst_map.data());
-TRACE(fpa2bv, tout << "subst'd = " << mk_ismt2_pp(res, m) << std::endl; );
-res = m.mk_forall(nv, new_decl_sorts.data(), new_decl_names.data(), res);
+var_subst vsubst(m, false); // use reverse order: var i is at position i.
+auto res = vsubst(e, subst_map);
+TRACE(fpa2bv, tout << "subst'd = " << mk_ismt2_pp(e, m) << "\n->\n" << mk_ismt2_pp(res, m) << "\n");
+new_decl_sorts.reverse(); // var 0 is at position num_decl_sorts.size() - 1, ...
+new_decl_names.reverse();
+res = m.mk_forall(new_decl_sorts.size(), new_decl_sorts.data(), new_decl_names.data(), res);
 return res;
 }
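The rewritten `extra_quantify` above compresses the used de Bruijn indices to a contiguous range `0..n-1` (the `j` counter) and reverses the declaration arrays before building the `forall`. The renumbering step on its own can be sketched as:

```python
def compress_vars(used_indices, max_idx):
    # map each used index i to a fresh contiguous index j (0, 1, 2, ...),
    # mirroring the `j` counter in the rewritten loop
    subst = {}
    j = 0
    for i in range(max_idx):
        if i in used_indices:
            subst[i] = j
            j += 1
    return subst

# indices {1, 4, 5} out of 0..5 collapse to 0, 1, 2
assert compress_vars({1, 4, 5}, 6) == {1: 0, 4: 1, 5: 2}
```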


@@ -1829,6 +1829,10 @@ br_status arith_rewriter::mk_power_core(expr * arg1, expr * arg2, expr_ref & res
 br_status arith_rewriter::mk_to_int_core(expr * arg, expr_ref & result) {
 numeral a;
 expr* x = nullptr;
+if (m_util.is_int(arg)) {
+result = arg;
+return BR_DONE;
+}
 if (m_util.convert_int_numerals_to_real())
 return BR_FAILED;
@@ -1837,7 +1841,7 @@ br_status arith_rewriter::mk_to_int_core(expr * arg, expr_ref & result) {
 return BR_DONE;
 }
-if (m_util.is_to_real(arg, x)) {
+if (m_util.is_to_real(arg, x) && m_util.is_int(x)) {
 result = x;
 return BR_DONE;
 }
@@ -1885,6 +1889,10 @@ br_status arith_rewriter::mk_to_real_core(expr * arg, expr_ref & result) {
 result = m_util.mk_numeral(a, false);
 return BR_DONE;
 }
+if (m_util.is_real(arg)) {
+result = arg;
+return BR_DONE;
+}
 // push to_real over OP_ADD, OP_MUL
 if (m_push_to_real) {
 if (m_util.is_add(arg) || m_util.is_mul(arg)) {
@@ -1909,7 +1917,7 @@ br_status arith_rewriter::mk_is_int(expr * arg, expr_ref & result) {
 return BR_DONE;
 }
-if (m_util.is_to_real(arg)) {
+if (m_util.is_to_real(arg) && m_util.is_int(to_app(arg)->get_arg(0))) {
 result = m.mk_true();
 return BR_DONE;
 }
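The new short-circuits make `to_int` and `to_real` behave as the identity on terms that already have the target sort, and restrict the `to_int(to_real(x)) -> x` cancellation to genuinely integer-sorted arguments. The rules can be sketched as plain functions over a toy `(op, sort, args)` term encoding (a hedged sketch, not Z3's AST):

```python
def rewrite_to_int(term):
    # term is an (op, sort, args) triple in this toy encoding
    op, sort, args = term
    if sort == 'Int':
        return term               # to_int on an Int term is the identity
    if op == 'to_real' and args[0][1] == 'Int':
        return args[0]            # to_int(to_real(x)) -> x only when x : Int
    return ('to_int', 'Int', [term])

x_int = ('x', 'Int', [])
assert rewrite_to_int(x_int) == x_int
assert rewrite_to_int(('to_real', 'Real', [x_int])) == x_int
```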


@@ -63,7 +63,7 @@ class bool_rewriter {
 bool m_elim_ite;
 ptr_vector<expr> m_todo1, m_todo2;
 unsigned_vector m_counts1, m_counts2;
-expr_fast_mark1 m_marked;
+expr_mark m_marked;
 br_status mk_flat_and_core(unsigned num_args, expr * const * args, expr_ref & result);
 br_status mk_flat_or_core(unsigned num_args, expr * const * args, expr_ref & result);


@@ -34,6 +34,9 @@ br_status recfun_rewriter::mk_app_core(func_decl * f, unsigned num_args, expr *
     for (unsigned i = 0; i < num_args; ++i)
         if (!m.is_value(args[i]))
             safe_to_subst = false;
+    for (auto t : subterms::all(expr_ref(r, m)))
+        if (is_uninterp(t))
+            return BR_FAILED;
     // check if there is an argument that is a constructor
     // such that the recursive function can be partially evaluated.


@@ -1088,7 +1088,7 @@ namespace euf {
             verbose_stream() << mk_pp(s->get_expr(), m) << "\n";
         }
 #endif
-        auto n = m_egraph.find(q);
+        // auto n = m_egraph.find(q);
 #if 0
         verbose_stream() << "class of " << mk_pp(q, m) << "\n";
         for (auto s : euf::enode_class(n)) {


@@ -2588,6 +2588,8 @@ namespace sls {
     template<typename num_t>
     void arith_base<num_t>::invariant() {
+        if (m.limit().is_canceled())
+            return;
         for (unsigned v = 0; v < ctx.num_bool_vars(); ++v) {
             auto ineq = get_ineq(v);
             if (ineq)
@@ -2622,6 +2624,8 @@ namespace sls {
         };
         for (var_t v = 0; v < m_vars.size(); ++v) {
             if (!eval_is_correct(v)) {
+                if (m.limit().is_canceled())
+                    return;
                 report_error(verbose_stream(), v);
                 TRACE(arith, report_error(tout, v));
                 UNREACHABLE();
@@ -2707,6 +2711,8 @@ namespace sls {
     void arith_base<num_t>::update_unchecked(var_t v, num_t const& new_value) {
         auto& vi = m_vars[v];
         auto old_value = value(v);
+        if (old_value == new_value)
+            return;
         IF_VERBOSE(5, verbose_stream() << "update: v" << v << " " << mk_bounded_pp(vi.m_expr, m) << " := " << old_value << " -> " << new_value << "\n");
         TRACE(arith, tout << "update: v" << v << " " << mk_bounded_pp(vi.m_expr, m) << " := " << old_value << " -> " << new_value << "\n");
         vi.set_value(new_value);


@@ -1220,32 +1220,65 @@ bool cmd_context::try_mk_builtin_app(symbol const & s, unsigned num_args, expr *
     return nullptr != result.get();
 }
-bool cmd_context::try_mk_declared_app(symbol const & s, unsigned num_args, expr * const * args,
-                                      unsigned num_indices, parameter const * indices, sort * range,
-                                      expr_ref & result) {
+bool cmd_context::try_mk_declared_app(symbol const &s, unsigned num_args, expr *const *args, unsigned num_indices,
+                                      parameter const *indices, sort *range, expr_ref &result) {
     if (!m_func_decls.contains(s))
         return false;
-    func_decls& fs = m_func_decls.find(s);
+    func_decls &fs = m_func_decls.find(s);
     if (num_args == 0 && !range) {
         if (fs.more_than_one())
-            throw cmd_exception("ambiguous constant reference, more than one constant with the same sort, use a qualified expression (as <symbol> <sort>) to disambiguate ", s);
-        func_decl * f = fs.first();
+            throw cmd_exception("ambiguous constant reference, more than one constant with the same sort, use a "
+                                "qualified expression (as <symbol> <sort>) to disambiguate ",
+                                s);
+        func_decl *f = fs.first();
         if (!f)
             return false;
         if (f->get_arity() != 0)
             result = array_util(m()).mk_as_array(f);
         else
             result = m().mk_const(f);
         return true;
     }
-    func_decl * f = fs.find(m(), num_args, args, range);
-    if (!f)
-        return false;
-    if (well_sorted_check_enabled())
-        m().check_sort(f, num_args, args);
-    result = m().mk_app(f, num_args, args);
-    return true;
+    func_decl *f = fs.find(m(), num_args, args, range);
+    if (f) {
+        if (f && well_sorted_check_enabled())
+            m().check_sort(f, num_args, args);
+        result = m().mk_app(f, num_args, args);
+        return true;
+    }
+    // f could be declared as an array and applied without explicit select
+    if (num_args > 0 && !range) {
+        if (fs.more_than_one())
+            throw cmd_exception("ambiguous constant reference, more than one constant with the same sort, use a "
+                                "qualified expression (as <symbol> <sort>) to disambiguate ",
+                                s);
+        func_decl *f = fs.first();
+        if (!f)
+            return false;
+        if (f->get_arity() != 0)
+            return false;
+        array_util au(m());
+        auto s = f->get_range();
+        if (!au.is_array(s))
+            return false;
+        unsigned sz = get_array_arity(s);
+        if (sz != num_args)
+            return false;
+        for (unsigned i = 0; i < sz; i++)
+            if (args[i]->get_sort() != get_array_domain(s, i))
+                return false;
+        expr_ref_vector new_args(m());
+        new_args.push_back(m().mk_const(f));
+        for (unsigned i = 0; i < num_args; i++)
+            new_args.push_back(args[i]);
+        result = au.mk_select(new_args.size(), new_args.data());
+        return true;
+    }
+    return false;
 }
 bool cmd_context::try_mk_macro_app(symbol const & s, unsigned num_args, expr * const * args,


@@ -24,6 +24,7 @@ z3_add_component(lp
     monomial_bounds.cpp
     nex_creator.cpp
     nla_basics_lemmas.cpp
+    nla_coi.cpp
     nla_common.cpp
     nla_core.cpp
     nla_divisions.cpp


@@ -988,7 +988,6 @@ namespace lp {
         if (belongs_to_s(ei)) {
             remove_from_S(ei);
         }
-        SASSERT(entry_invariant(ei));
     }
     void find_changed_terms_and_more_changed_rows() {
@@ -1099,6 +1098,7 @@ namespace lp {
         m_changed_f_columns.reset();
         m_changed_rows.reset();
         m_changed_terms.reset();
+        SASSERT(entries_are_ok());
     }
     int get_sign_in_e_row(unsigned ei, unsigned j) const {

src/math/lp/nla_coi.cpp (new file, 88 lines)

@@ -0,0 +1,88 @@
/*++
Copyright (c) 2025 Microsoft Corporation

Author:
    Lev Nachmanson (levnach)
    Nikolaj Bjorner (nbjorner)
--*/

#include "math/lp/nla_core.h"
#include "math/lp/nla_coi.h"

namespace nla {

    void coi::init() {
        indexed_uint_set visited;
        unsigned_vector todo;
        vector<occurs> var2occurs;
        m_term_set.reset();
        m_mon_set.reset();
        m_constraint_set.reset();
        m_var_set.reset();
        auto& lra = c.lra_solver();

        for (auto ci : lra.constraints().indices()) {
            auto const& c = lra.constraints()[ci];
            if (c.is_auxiliary())
                continue;
            for (auto const& [coeff, v] : c.coeffs()) {
                var2occurs.reserve(v + 1);
                var2occurs[v].constraints.push_back(ci);
            }
        }

        for (auto const& m : c.emons()) {
            for (auto v : m.vars()) {
                var2occurs.reserve(v + 1);
                var2occurs[v].monics.push_back(m.var());
            }
        }

        for (const auto* t : lra.terms()) {
            for (auto const iv : *t) {
                auto v = iv.j();
                var2occurs.reserve(v + 1);
                var2occurs[v].terms.push_back(t->j());
            }
        }

        for (auto const& m : c.to_refine())
            todo.push_back(m);

        for (unsigned i = 0; i < todo.size(); ++i) {
            auto v = todo[i];
            if (visited.contains(v))
                continue;
            visited.insert(v);
            m_var_set.insert(v);
            var2occurs.reserve(v + 1);
            for (auto ci : var2occurs[v].constraints) {
                m_constraint_set.insert(ci);
                auto const& c = lra.constraints()[ci];
                for (auto const& [coeff, w] : c.coeffs())
                    todo.push_back(w);
            }
            for (auto w : var2occurs[v].monics)
                todo.push_back(w);
            for (auto ti : var2occurs[v].terms) {
                for (auto iv : lra.get_term(ti))
                    todo.push_back(iv.j());
                todo.push_back(ti);
            }
            if (lra.column_has_term(v)) {
                m_term_set.insert(v);
                for (auto kv : lra.get_term(v))
                    todo.push_back(kv.j());
            }
            if (c.is_monic_var(v)) {
                m_mon_set.insert(v);
                for (auto w : c.emons()[v])
                    todo.push_back(w);
            }
        }
    }
}
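`coi::init` above is a transitive closure over variable occurrences: seed variables come from the monomials to refine, and the `todo` vector grows while it is scanned by index. A minimal standalone sketch of the same worklist pattern (hypothetical data layout, not Z3's types):

```python
# Hypothetical sketch of the cone-of-influence closure in coi::init:
# starting from "to refine" seed variables, collect every variable and
# constraint reachable through shared occurrences.
def compute_coi(constraint_vars, seeds):
    # var -> constraints it occurs in (the var2occurs map in the patch)
    var2occurs = {}
    for ci, vs in enumerate(constraint_vars):
        for v in vs:
            var2occurs.setdefault(v, []).append(ci)
    vars_in, cons_in, visited = set(), set(), set()
    todo = list(seeds)
    i = 0
    # todo grows while we scan it, exactly like the indexed loop in init()
    while i < len(todo):
        v = todo[i]
        i += 1
        if v in visited:
            continue
        visited.add(v)
        vars_in.add(v)
        for ci in var2occurs.get(v, []):
            cons_in.add(ci)
            todo.extend(constraint_vars[ci])
    return vars_in, cons_in

# c0 = {0,1}, c1 = {1,2}, c2 = {3,4}; c2 is disconnected from seed 0 and pruned
vs, cs = compute_coi([[0, 1], [1, 2], [3, 4]], [0])
```

Everything outside the cone (here constraint `c2` and variables 3, 4) can be omitted when handing the problem to nlsat.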

src/math/lp/nla_coi.h (new file, 43 lines)

@@ -0,0 +1,43 @@
/*++
Copyright (c) 2025 Microsoft Corporation

Abstract:
    Class for computing the cone of influence for NL constraints.
    It includes the variables that come from monomials with an incorrect evaluation and,
    transitively, all constraints and variables connected to them.

Author:
    Lev Nachmanson (levnach)
    Nikolaj Bjorner (nbjorner)
--*/
#pragma once

namespace nla {
    class core;

    class coi {
        core& c;
        indexed_uint_set m_mon_set, m_constraint_set;
        indexed_uint_set m_term_set, m_var_set;

        struct occurs {
            unsigned_vector constraints;
            unsigned_vector monics;
            unsigned_vector terms;
        };

    public:
        coi(core& c) : c(c) {}
        void init();
        indexed_uint_set const& mons() const { return m_mon_set; }
        indexed_uint_set const& constraints() const { return m_constraint_set; }
        indexed_uint_set& terms() { return m_term_set; }
        indexed_uint_set const& vars() { return m_var_set; }
    };
}


@@ -1282,7 +1282,7 @@ void core::add_bounds() {
     }
 }
-lbool core::check() {
+lbool core::check(unsigned level) {
     lp_settings().stats().m_nla_calls++;
     TRACE(nla_solver, tout << "calls = " << lp_settings().stats().m_nla_calls << "\n";);
     lra.get_rid_of_inf_eps();
@@ -1363,7 +1363,7 @@ lbool core::check() {
         ret = bounded_nlsat();
     }
-    if (no_effect() && params().arith_nl_nra()) {
+    if (no_effect() && params().arith_nl_nra() && level >= 2) {
         scoped_limits sl(m_reslim);
         sl.push_child(&m_nra_lim);
         params_ref p;
@@ -1432,7 +1432,7 @@ bool core::no_lemmas_hold() const {
 lbool core::test_check() {
     lra.set_status(lp::lp_status::OPTIMAL);
-    return check();
+    return check(2);
 }
 std::unordered_set<lpvar> core::get_vars_of_expr_with_opening_terms(const nex *e ) {


@@ -394,7 +394,7 @@ public:
     bool conflict_found() const;
-    lbool check();
+    lbool check(unsigned level);
     lbool check_power(lpvar r, lpvar x, lpvar y);
     void check_bounded_divisions();
@@ -450,6 +450,12 @@ public:
     nla_throttle& throttle() { return m_throttle; }
     const nla_throttle& throttle() const { return m_throttle; }
+    lp::lar_solver& lra_solver() { return lra; }
+    indexed_uint_set const& to_refine() const {
+        return m_to_refine;
+    }
 }; // end of core
 struct pp_mon {


@@ -76,6 +76,14 @@ namespace nla {
         find_nl_cluster();
         if (!configure())
             return;
+        try {
+            if (propagate_gcd_test())
+                return;
+        }
+        catch (...) {
+        }
         m_solver.saturate();
         TRACE(grobner, m_solver.display(tout));


@@ -343,7 +343,7 @@ std::ostream& core::display_declarations_smt(std::ostream& out) const {
         out << "); " << val(v) << " = ";
         rational p(1);
         for (auto w : m.vars())
-            p *= val(v);
+            p *= val(w);
         out << p;
         out << "\n";
     }
@@ -360,7 +360,6 @@ std::ostream& core::display_constraint_smt(std::ostream& out, unsigned id, lp::l
     auto k = c.kind();
     auto rhs = c.rhs();
     auto lhs = c.coeffs();
-    auto sz = lhs.size();
     rational den = denominator(rhs);
     for (auto [coeff, v] : lhs)
         den = lcm(den, denominator(coeff));


@@ -46,8 +46,8 @@ namespace nla {
     bool solver::need_check() { return m_core->has_relevant_monomial(); }
-    lbool solver::check() {
-        return m_core->check();
+    lbool solver::check(unsigned level) {
+        return m_core->check(level);
     }
     void solver::propagate() {


@@ -37,7 +37,7 @@ namespace nla {
     void push();
     void pop(unsigned scopes);
     bool need_check();
-    lbool check();
+    lbool check(unsigned level);
     void propagate();
     void simplify() { m_core->simplify(); }
     lbool check_power(lpvar r, lpvar x, lpvar y);


@@ -9,6 +9,7 @@
 #include <fstream>
 #include "math/lp/lar_solver.h"
 #include "math/lp/nra_solver.h"
+#include "math/lp/nla_coi.h"
 #include "nlsat/nlsat_solver.h"
 #include "math/polynomial/polynomial.h"
 #include "math/polynomial/algebraic_numbers.h"
@@ -25,114 +26,156 @@ typedef nla::mon_eq mon_eq;
 typedef nla::variable_map_type variable_map_type;
 struct solver::imp {
     lp::lar_solver& lra;
     reslimit& m_limit;
     params_ref m_params;
     u_map<polynomial::var> m_lp2nl; // map from lar_solver variables to nlsat::solver variables
-    indexed_uint_set m_term_set;
     scoped_ptr<nlsat::solver> m_nlsat;
     scoped_ptr<scoped_anum_vector> m_values; // values provided by LRA solver
     scoped_ptr<scoped_anum> m_tmp1, m_tmp2;
+    nla::coi m_coi;
     nla::core& m_nla_core;
     imp(lp::lar_solver& s, reslimit& lim, params_ref const& p, nla::core& nla_core):
         lra(s),
         m_limit(lim),
         m_params(p),
+        m_coi(nla_core),
         m_nla_core(nla_core) {}
     bool need_check() {
         return m_nla_core.m_to_refine.size() != 0;
     }
-    indexed_uint_set m_mon_set, m_constraint_set;
-    struct occurs {
-        unsigned_vector constraints;
-        unsigned_vector monics;
-        unsigned_vector terms;
-    };
-    void init_cone_of_influence() {
-        indexed_uint_set visited;
-        unsigned_vector todo;
-        vector<occurs> var2occurs;
-        m_term_set.reset();
-        m_mon_set.reset();
-        m_constraint_set.reset();
-        for (auto ci : lra.constraints().indices()) {
-            auto const& c = lra.constraints()[ci];
-            if (c.is_auxiliary())
-                continue;
-            for (auto const& [coeff, v] : c.coeffs()) {
-                var2occurs.reserve(v + 1);
-                var2occurs[v].constraints.push_back(ci);
-            }
-        }
-        for (auto const& m : m_nla_core.emons()) {
-            for (auto v : m.vars()) {
-                var2occurs.reserve(v + 1);
-                var2occurs[v].monics.push_back(m.var());
-            }
-        }
-        for (const auto *t : lra.terms()) {
-            for (auto const iv : *t) {
-                auto v = iv.j();
-                var2occurs.reserve(v + 1);
-                var2occurs[v].terms.push_back(t->j());
-            }
-        }
-        for (auto const& m : m_nla_core.m_to_refine)
-            todo.push_back(m);
-        for (unsigned i = 0; i < todo.size(); ++i) {
-            auto v = todo[i];
-            if (visited.contains(v))
-                continue;
-            visited.insert(v);
-            var2occurs.reserve(v + 1);
-            for (auto ci : var2occurs[v].constraints) {
-                m_constraint_set.insert(ci);
-                auto const& c = lra.constraints()[ci];
-                for (auto const& [coeff, w] : c.coeffs())
-                    todo.push_back(w);
-            }
-            for (auto w : var2occurs[v].monics)
-                todo.push_back(w);
-            for (auto ti : var2occurs[v].terms) {
-                for (auto iv : lra.get_term(ti))
-                    todo.push_back(iv.j());
-                todo.push_back(ti);
-            }
-            if (lra.column_has_term(v)) {
-                m_term_set.insert(v);
-                for (auto kv : lra.get_term(v))
-                    todo.push_back(kv.j());
-            }
-            if (m_nla_core.is_monic_var(v)) {
-                m_mon_set.insert(v);
-                for (auto w : m_nla_core.emons()[v])
-                    todo.push_back(w);
-            }
-        }
-    }
     void reset() {
         m_values = nullptr;
         m_tmp1 = nullptr; m_tmp2 = nullptr;
         m_nlsat = alloc(nlsat::solver, m_limit, m_params, false);
         m_values = alloc(scoped_anum_vector, am());
-        m_term_set.reset();
         m_lp2nl.reset();
     }
+    // Create the polynomial definition for variable v used in setup_solver_poly.
+    // Side-effects: appends to definitions and denominators.
+    void mk_definition(unsigned v, polynomial_ref_vector& definitions, vector<rational>& denominators) {
+        auto& pm = m_nlsat->pm();
+        polynomial::polynomial_ref p(pm);
+        rational den(1);
+        if (m_nla_core.emons().is_monic_var(v)) {
+            auto const& m = m_nla_core.emons()[v];
+            for (auto w : m.vars()) {
+                den = denominators[w] * den;
+                polynomial_ref pw(definitions.get(w), m_nlsat->pm());
+                if (!p)
+                    p = pw;
+                else
+                    p = p * pw;
+            }
+        }
+        else if (lra.column_has_term(v)) {
+            for (auto const& [w, coeff] : lra.get_term(v))
+                den = lcm(denominator(coeff / denominators[w]), den);
+            for (auto const& [w, coeff] : lra.get_term(v)) {
+                auto coeff1 = den * coeff / denominators[w];
+                polynomial_ref pw(definitions.get(w), m_nlsat->pm());
+                if (!p)
+                    p = constant(coeff1) * pw;
+                else
+                    p = p + (constant(coeff1) * pw);
+            }
+        }
+        else {
+            p = pm.mk_polynomial(lp2nl(v)); // nlsat var index equals v (verified above when created)
+        }
+        definitions.push_back(p);
+        denominators.push_back(den);
+    }
+    void setup_solver_poly() {
+        m_coi.init();
+        auto& pm = m_nlsat->pm();
+        polynomial_ref_vector definitions(pm);
+        vector<rational> denominators;
+        for (unsigned v = 0; v < lra.number_of_vars(); ++v) {
+            if (m_coi.vars().contains(v)) {
+                auto j = m_nlsat->mk_var(lra.var_is_int(v));
+                m_lp2nl.insert(v, j); // we don't really need this. It is going to be the identity map.
+                mk_definition(v, definitions, denominators);
+            }
+            else {
+                definitions.push_back(nullptr);
+                denominators.push_back(rational(0));
+            }
+        }
+        // we rely on all information encoded into the tableau being present as a constraint.
+        for (auto ci : m_coi.constraints()) {
+            auto& c = lra.constraints()[ci];
+            auto& pm = m_nlsat->pm();
+            auto k = c.kind();
+            auto rhs = c.rhs();
+            auto lhs = c.coeffs();
+            rational den = denominator(rhs);
+            //
+            // let v := p / denominators[v]
+            //
+            // sum(coeff[v] * v) k rhs
+            // ==
+            // sum(coeff[v] * (p / denominators[v])) k rhs
+            // ==
+            // sum((coeff[v] / denominators[v]) * p) k rhs
+            //
+            for (auto [coeff, v] : lhs)
+                den = lcm(den, denominator(coeff / denominators[v]));
+            polynomial::polynomial_ref p(pm);
+            p = pm.mk_const(-den * rhs);
+            for (auto [coeff, v] : lhs) {
+                polynomial_ref poly(pm);
+                poly = definitions.get(v);
+                poly = poly * constant(den * coeff / denominators[v]);
+                p = p + poly;
+            }
+            add_constraint(p, ci, k);
+            TRACE(nra, tout << "constraint " << ci << ": " << p << " " << k << " 0\n";
+                  lra.constraints().display(tout, ci) << "\n");
+        }
+        definitions.reset();
+    }
+    void setup_solver_terms() {
+        m_coi.init();
+        // add linear inequalities from lra_solver
+        for (auto ci : m_coi.constraints())
+            add_constraint(ci);
+        // add polynomial definitions.
+        for (auto const& m : m_coi.mons())
+            add_monic_eq(m_nla_core.emons()[m]);
+        // add term definitions.
+        for (unsigned i : m_coi.terms())
+            add_term(i);
+    }
+    polynomial::polynomial_ref sub(polynomial::polynomial* a, polynomial::polynomial* b) {
+        return polynomial_ref(m_nlsat->pm().sub(a, b), m_nlsat->pm());
+    }
+    polynomial::polynomial_ref mul(polynomial::polynomial* a, polynomial::polynomial* b) {
+        return polynomial_ref(m_nlsat->pm().mul(a, b), m_nlsat->pm());
+    }
+    polynomial::polynomial_ref var(lp::lpvar v) {
+        return polynomial_ref(m_nlsat->pm().mk_polynomial(lp2nl(v)), m_nlsat->pm());
+    }
+    polynomial::polynomial_ref constant(rational const& r) {
+        return polynomial_ref(m_nlsat->pm().mk_const(r), m_nlsat->pm());
+    }
     /**
        \brief one-shot nlsat check.
        A one shot checker is the least functionality that can
@@ -147,24 +190,14 @@ struct solver::imp {
     lbool check() {
         SASSERT(need_check());
         reset();
         vector<nlsat::assumption, false> core;
-        init_cone_of_influence();
-        // add linear inequalities from lra_solver
-        for (auto ci : m_constraint_set)
-            add_constraint(ci);
-        // add polynomial definitions.
-        for (auto const& m : m_mon_set)
-            add_monic_eq(m_nla_core.emons()[m]);
-        // add term definitions.
-        for (unsigned i : m_term_set)
-            add_term(i);
+        smt_params_helper p(m_params);
+        setup_solver_poly();
         TRACE(nra, m_nlsat->display(tout));
-        smt_params_helper p(m_params);
         if (p.arith_nl_log()) {
             static unsigned id = 0;
             std::stringstream strm;
@@ -196,12 +229,14 @@ struct solver::imp {
             }
         }
         m_nlsat->collect_statistics(st);
-        TRACE(nra,
+        TRACE(nra, tout << "nra result " << r << "\n");
+        CTRACE(nra, false,
               m_nlsat->display(tout << r << "\n");
               display(tout);
               for (auto [j, x] : m_lp2nl) tout << "j" << j << " := x" << x << "\n";);
         switch (r) {
         case l_true:
+            m_nlsat->restore_order();
             m_nla_core.set_use_nra_model(true);
             lra.init_model();
             for (lp::constraint_index ci : lra.constraints().indices())
@@ -223,14 +258,15 @@ struct solver::imp {
         case l_false: {
             lp::explanation ex;
             m_nlsat->get_core(core);
-            for (auto c : core) {
-                unsigned idx = static_cast<unsigned>(static_cast<imp*>(c) - this);
-                ex.push_back(idx);
-                TRACE(nra, lra.display_constraint(tout << "ex: " << idx << ": ", idx) << "\n";);
-            }
             nla::lemma_builder lemma(m_nla_core, __FUNCTION__);
+            for (auto c : core) {
+                unsigned idx = static_cast<unsigned>(static_cast<imp *>(c) - this);
+                ex.push_back(idx);
+            }
             lemma &= ex;
             m_nla_core.set_use_nra_model(true);
+            TRACE(nra, tout << lemma << "\n");
             break;
         }
         case l_undef:
@@ -272,12 +308,24 @@ struct solver::imp {
         coeffs.push_back(mpz(1));
         coeffs.push_back(mpz(-1));
         polynomial::polynomial_ref p(pm.mk_polynomial(2, coeffs.data(), mls), pm);
-        polynomial::polynomial* ps[1] = { p };
-        bool even[1] = { false };
-        nlsat::literal lit = m_nlsat->mk_ineq_literal(nlsat::atom::kind::EQ, 1, ps, even);
+        auto lit = mk_literal(p.get(), lp::lconstraint_kind::EQ);
         m_nlsat->mk_clause(1, &lit, nullptr);
     }
+    nlsat::literal mk_literal(polynomial::polynomial *p, lp::lconstraint_kind k) {
+        polynomial::polynomial *ps[1] = { p };
+        bool is_even[1] = { false };
+        switch (k) {
+        case lp::lconstraint_kind::LE: return ~m_nlsat->mk_ineq_literal(nlsat::atom::kind::GT, 1, ps, is_even);
+        case lp::lconstraint_kind::GE: return ~m_nlsat->mk_ineq_literal(nlsat::atom::kind::LT, 1, ps, is_even);
+        case lp::lconstraint_kind::LT: return m_nlsat->mk_ineq_literal(nlsat::atom::kind::LT, 1, ps, is_even);
+        case lp::lconstraint_kind::GT: return m_nlsat->mk_ineq_literal(nlsat::atom::kind::GT, 1, ps, is_even);
+        case lp::lconstraint_kind::EQ: return m_nlsat->mk_ineq_literal(nlsat::atom::kind::EQ, 1, ps, is_even);
+        default: UNREACHABLE();
+        }
+        throw default_exception("unexpected operator");
+    }
     void add_constraint(unsigned idx) {
         auto& c = lra.constraints()[idx];
         auto& pm = m_nlsat->pm();
@@ -297,30 +345,26 @@ struct solver::imp {
         }
         rhs *= den;
         polynomial::polynomial_ref p(pm.mk_linear(sz, coeffs.data(), vars.data(), -rhs), pm);
-        polynomial::polynomial* ps[1] = { p };
-        bool is_even[1] = { false };
+        nlsat::literal lit = mk_literal(p.get(), k);
+        nlsat::assumption a = this + idx;
+        m_nlsat->mk_clause(1, &lit, a);
+    }
+    nlsat::literal add_constraint(polynomial::polynomial *p, unsigned idx, lp::lconstraint_kind k) {
+        polynomial::polynomial *ps[1] = {p};
+        bool is_even[1] = {false};
         nlsat::literal lit;
         nlsat::assumption a = this + idx;
         switch (k) {
-        case lp::lconstraint_kind::LE:
-            lit = ~m_nlsat->mk_ineq_literal(nlsat::atom::kind::GT, 1, ps, is_even);
-            break;
-        case lp::lconstraint_kind::GE:
-            lit = ~m_nlsat->mk_ineq_literal(nlsat::atom::kind::LT, 1, ps, is_even);
-            break;
-        case lp::lconstraint_kind::LT:
-            lit = m_nlsat->mk_ineq_literal(nlsat::atom::kind::LT, 1, ps, is_even);
-            break;
-        case lp::lconstraint_kind::GT:
-            lit = m_nlsat->mk_ineq_literal(nlsat::atom::kind::GT, 1, ps, is_even);
-            break;
-        case lp::lconstraint_kind::EQ:
-            lit = m_nlsat->mk_ineq_literal(nlsat::atom::kind::EQ, 1, ps, is_even);
-            break;
-        default:
-            UNREACHABLE(); // unreachable
+        case lp::lconstraint_kind::LE: lit = ~m_nlsat->mk_ineq_literal(nlsat::atom::kind::GT, 1, ps, is_even); break;
+        case lp::lconstraint_kind::GE: lit = ~m_nlsat->mk_ineq_literal(nlsat::atom::kind::LT, 1, ps, is_even); break;
+        case lp::lconstraint_kind::LT: lit = m_nlsat->mk_ineq_literal(nlsat::atom::kind::LT, 1, ps, is_even); break;
+        case lp::lconstraint_kind::GT: lit = m_nlsat->mk_ineq_literal(nlsat::atom::kind::GT, 1, ps, is_even); break;
+        case lp::lconstraint_kind::EQ: lit = m_nlsat->mk_ineq_literal(nlsat::atom::kind::EQ, 1, ps, is_even); break;
+        default: UNREACHABLE();
         }
         m_nlsat->mk_clause(1, &lit, a);
+        return lit;
     }
     bool check_monic(mon_eq const& m) {
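The `mk_literal` helper encodes non-strict comparisons via negated strict atoms: `p <= 0` becomes `~(p > 0)` and `p >= 0` becomes `~(p < 0)`, since nlsat only has strict and equality atoms. A tiny sketch of the same trick over concrete numbers (nlsat applies it to polynomial atoms instead):

```python
# Semantics of the kind-to-literal mapping in mk_literal, evaluated on a
# concrete value of the polynomial p (illustrative stand-in for nlsat atoms).
def holds(p_value, kind):
    if kind == "LE":
        return not (p_value > 0)   # ~GT
    if kind == "GE":
        return not (p_value < 0)   # ~LT
    if kind == "LT":
        return p_value < 0
    if kind == "GT":
        return p_value > 0
    if kind == "EQ":
        return p_value == 0
    raise ValueError("unexpected operator")
```

Negating the strict atom keeps the atom set minimal while still covering all five `lconstraint_kind` cases.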
@@ -370,7 +414,7 @@ struct solver::imp {
         for (auto const& m : m_nla_core.emons())
             if (any_of(m.vars(), [&](lp::lpvar v) { return m_lp2nl.contains(v); }))
                 add_monic_eq_bound(m);
-        for (unsigned i : m_term_set)
+        for (unsigned i : m_coi.terms())
             add_term(i);
         for (auto const& [v, w] : m_lp2nl) {
             if (lra.column_has_lower_bound(v))
@@ -397,6 +441,7 @@ struct solver::imp {
         switch (r) {
         case l_true:
+            m_nlsat->restore_order();
             m_nla_core.set_use_nra_model(true);
             lra.init_model();
             for (lp::constraint_index ci : lra.constraints().indices())
@@ -418,6 +463,7 @@ struct solver::imp {
             ex.push_back(ci);
             nla::lemma_builder lemma(m_nla_core, __FUNCTION__);
             lemma &= ex;
+            TRACE(nra, tout << lemma << "\n");
             break;
         }
         case l_undef:
@@ -554,8 +600,8 @@ struct solver::imp {
         if (!m_lp2nl.find(v, r)) {
             r = m_nlsat->mk_var(is_int(v));
             m_lp2nl.insert(v, r);
-            if (!m_term_set.contains(v) && lra.column_has_term(v)) {
-                m_term_set.insert(v);
+            if (!m_coi.terms().contains(v) && lra.column_has_term(v)) {
+                m_coi.terms().insert(v);
             }
         }
         return r;
@@ -586,20 +632,56 @@ struct solver::imp {
         m_nlsat->mk_clause(1, &lit, nullptr);
     }
-    nlsat::anum const& value(lp::lpvar v) {
-        polynomial::var pv;
-        if (m_lp2nl.find(v, pv))
-            return m_nlsat->value(pv);
-        else {
-            for (unsigned w = m_values->size(); w <= v; ++w) {
-                scoped_anum a(am());
-                am().set(a, m_nla_core.val(w).to_mpq());
-                m_values->push_back(a);
-            }
-            return (*m_values)[v];
-        }
+    nlsat::anum const &value(lp::lpvar v) {
+        init_values(v + 1);
+        return (*m_values)[v];
+    }
+    void init_values(unsigned sz) {
+        if (m_values->size() >= sz)
+            return;
+        unsigned w;
+        scoped_anum a(am());
+        for (unsigned v = m_values->size(); v < sz; ++v) {
+            if (m_nla_core.emons().is_monic_var(v)) {
+                am().set(a, 1);
+                auto &m = m_nla_core.emon(v);
+                for (auto x : m.vars())
+                    am().mul(a, (*m_values)[x], a);
+                m_values->push_back(a);
+            }
+            else if (lra.column_has_term(v)) {
+                scoped_anum b(am());
+                am().set(a, 0);
+                for (auto const &[w, coeff] : lra.get_term(v)) {
+                    am().set(b, coeff.to_mpq());
+                    am().mul(b, (*m_values)[w], b);
+                    am().add(a, b, a);
+                }
+                m_values->push_back(a);
+            }
+            else if (m_lp2nl.find(v, w)) {
+                m_values->push_back(m_nlsat->value(w));
+            }
+            else {
+                am().set(a, m_nla_core.val(v).to_mpq());
+                m_values->push_back(a);
+            }
+        }
+    }
+    void set_value(lp::lpvar v, rational const& value) {
+        if (!m_values)
+            m_values = alloc(scoped_anum_vector, am());
+        scoped_anum a(am());
+        am().set(a, value.to_mpq());
+        while (m_values->size() <= v)
+            m_values->push_back(a);
+        am().set((*m_values)[v], a);
     }
     nlsat::anum_manager& am() {
         return m_nlsat->am();
     }
@@ -680,4 +762,8 @@ void solver::updt_params(params_ref& p) {
     m_imp->updt_params(p);
 }
+void solver::set_value(lp::lpvar v, rational const& value) {
+    m_imp->set_value(v, value);
+}
 }
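The new `init_values` fills the value table in increasing variable order, so a monic variable's value can be computed as the product of earlier entries and a term variable's value as a linear combination of them. A hypothetical sketch with the definitions made explicit (exact rational arithmetic stands in for the algebraic-number manager):

```python
from fractions import Fraction

# defs[v] is None (base variable), ("monic", [vars]) or ("term", [(coeff, var)]).
# Variables are evaluated in index order, mirroring the loop in init_values.
def eval_vars(defs, base):
    values = []
    for v, d in enumerate(defs):
        if d is None:
            values.append(Fraction(base[v]))          # value supplied by the solver
        elif d[0] == "monic":
            p = Fraction(1)
            for w in d[1]:
                p *= values[w]                        # product of component values
            values.append(p)
        else:  # linear term
            values.append(sum((Fraction(c) * values[w] for c, w in d[1]),
                              Fraction(0)))           # linear combination
    return values

# v2 = v0 * v1, v3 = 2*v0 + v2, with base values v0 = 3, v1 = 4
vals = eval_vars([None, None, ("monic", [0, 1]), ("term", [(2, 0), (1, 2)])],
                 {0: 3, 1: 4})
```

The index-order evaluation works because, in the lar_solver column layout assumed here, a monic or term column only refers to columns with smaller indices.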


@@ -59,6 +59,8 @@ namespace nra {
     nlsat::anum_manager& am();
+    void set_value(lp::lpvar v, rational const &value);
     scoped_anum& tmp1();
     scoped_anum& tmp2();


@@ -2772,9 +2772,12 @@ namespace algebraic_numbers {
         return out;
     }
-    std::ostream& display_root_smt2(std::ostream & out, numeral const & a) {
+    template<typename Printer>
+    std::ostream& display_root_common(std::ostream & out, numeral const & a, char const* var_name, bool no_power, Printer&& printer) {
+        SASSERT(var_name != nullptr);
         if (is_zero(a)) {
-            out << "(root-obj x 1)";
+            auto poly_printer = [&](std::ostream& dst) { dst << var_name; };
+            return printer(out, poly_printer, 1u);
         }
         else if (a.is_basic()) {
             mpq const & v = basic_value(a);
@@ -2782,25 +2785,53 @@ namespace algebraic_numbers {
             qm().set(neg_n, v.numerator());
             qm().neg(neg_n);
             mpz coeffs[2] = { std::move(neg_n), qm().dup(v.denominator()) };
-            out << "(root-obj ";
-            upm().display_smt2(out, 2, coeffs, "x");
-            out << " 1)"; // first root of the polynomial d*# - n
+            auto poly_printer = [&](std::ostream& dst) {
+                if (no_power)
+                    upm().display_smt2_no_power(dst, 2, coeffs, var_name);
+                else
+                    upm().display_smt2(dst, 2, coeffs, var_name);
+            };
+            std::ostream& r = printer(out, poly_printer, 1u); // first root of d*x - n
             qm().del(coeffs[0]);
             qm().del(coeffs[1]);
+            return r;
         }
         else {
             algebraic_cell * c = a.to_algebraic();
-            out << "(root-obj ";
-            upm().display_smt2(out, c->m_p_sz, c->m_p, "x");
+            auto poly_printer = [&](std::ostream& dst) {
+                if (no_power)
+                    upm().display_smt2_no_power(dst, c->m_p_sz, c->m_p, var_name);
+                else
+                    upm().display_smt2(dst, c->m_p_sz, c->m_p, var_name);
+            };
             if (c->m_i == 0) {
                 // undefined
                 c->m_i = upm().get_root_id(c->m_p_sz, c->m_p, lower(c)) + 1;
             }
             SASSERT(c->m_i > 0);
-            out << " " << c->m_i;
-            out << ")";
+            return printer(out, poly_printer, c->m_i);
         }
-        return out;
+    }
+
+    std::ostream& display_root_smt2(std::ostream & out, numeral const & a) {
+        auto printer = [&](std::ostream& dst, auto const& poly_printer, unsigned idx) -> std::ostream& {
+            dst << "(root-obj ";
+            poly_printer(dst);
+            dst << " " << idx << ")";
+            return dst;
+        };
+        return display_root_common(out, a, "x", false, printer);
+    }
+
+    std::ostream& display_root_smtrat(std::ostream & out, numeral const & a, char const* var_name) {
+        SASSERT(var_name != nullptr);
+        auto printer = [&](std::ostream& dst, auto const& poly_printer, unsigned idx) -> std::ostream& {
+            dst << "(root ";
+            poly_printer(dst);
+            dst << " " << idx << " " << var_name << ")";
+            return dst;
+        };
+        return display_root_common(out, a, var_name, true, printer);
     }
     std::ostream& display_interval(std::ostream & out, numeral const & a) {
@ -3167,6 +3198,10 @@ namespace algebraic_numbers {
return m_imp->display_root_smt2(out, a); return m_imp->display_root_smt2(out, a);
} }
std::ostream& manager::display_root_smtrat(std::ostream & out, numeral const & a, char const* var_name) const {
return m_imp->display_root_smtrat(out, a, var_name);
}
void manager::reset_statistics() { void manager::reset_statistics() {
m_imp->reset_statistics(); m_imp->reset_statistics();
} }

@@ -345,6 +345,12 @@ namespace algebraic_numbers {
         */
        std::ostream& display_root_smt2(std::ostream & out, numeral const & a) const;

+       /**
+          \brief Display algebraic number using an SMT-RAT style root expression: (root p i x)
+          where the final argument denotes the variable bound to this root.
+       */
+       std::ostream& display_root_smtrat(std::ostream & out, numeral const & a, char const* var_name) const;
+
        /**
           \brief Display algebraic number in Mathematica format.
        */
@@ -495,4 +501,3 @@ inline std::ostream & operator<<(std::ostream & out, interval_pp const & n) {
     n.m.display_interval(out, n.n);
     return out;
 }

@@ -1159,67 +1159,89 @@ namespace upolynomial {
         }
     }

+    static void display_smt2_var_power(std::ostream & out, char const * var_name, unsigned k, bool allow_power) {
+        SASSERT(k > 0);
+        if (k == 1) {
+            out << var_name;
+        }
+        else if (allow_power) {
+            out << "(^ " << var_name << " " << k << ")";
+        }
+        else {
+            out << "(*";
+            for (unsigned i = 0; i < k; ++i)
+                out << " " << var_name;
+            out << ")";
+        }
+    }
+
     static void display_smt2_monomial(std::ostream & out, numeral_manager & m, mpz const & n,
-                                      unsigned k, char const * var_name) {
+                                      unsigned k, char const * var_name, bool allow_power) {
         if (k == 0) {
             display_smt2_mumeral(out, m, n);
         }
         else if (m.is_one(n)) {
-            if (k == 1)
-                out << var_name;
-            else
-                out << "(^ " << var_name << " " << k << ")";
+            display_smt2_var_power(out, var_name, k, allow_power);
         }
         else {
             out << "(* ";
             display_smt2_mumeral(out, m, n);
             out << " ";
-            if (k == 1)
-                out << var_name;
-            else
-                out << "(^ " << var_name << " " << k << ")";
+            display_smt2_var_power(out, var_name, k, allow_power);
             out << ")";
         }
     }

-    // Display p as an s-expression
-    std::ostream& core_manager::display_smt2(std::ostream & out, unsigned sz, numeral const * p, char const * var_name) const {
+    static std::ostream& display_smt2_core(std::ostream & out, core_manager const& cm, unsigned sz, numeral const * p, char const * var_name, bool allow_power) {
         if (sz == 0) {
             out << "0";
             return out;
         }
         if (sz == 1) {
-            display_smt2_mumeral(out, m(), p[0]);
+            display_smt2_mumeral(out, cm.m(), p[0]);
             return out;
         }
         unsigned non_zero_idx  = UINT_MAX;
         unsigned num_non_zeros = 0;
         for (unsigned i = 0; i < sz; i++) {
-            if (m().is_zero(p[i]))
+            if (cm.m().is_zero(p[i]))
                 continue;
             non_zero_idx = i;
             num_non_zeros ++;
         }
-        if (num_non_zeros == 1) {
-            SASSERT(non_zero_idx != UINT_MAX && non_zero_idx >= 1);
-            display_smt2_monomial(out, m(), p[non_zero_idx], non_zero_idx, var_name);
+        if (num_non_zeros == 1 && non_zero_idx != UINT_MAX) {
+            if (non_zero_idx == 0) {
+                display_smt2_mumeral(out, cm.m(), p[0]);
+                return out;
+            }
+            display_smt2_monomial(out, cm.m(), p[non_zero_idx], non_zero_idx, var_name, allow_power);
             return out;
         }
         out << "(+";
         unsigned i = sz;
         while (i > 0) {
             --i;
-            if (!m().is_zero(p[i])) {
+            if (!cm.m().is_zero(p[i])) {
                 out << " ";
-                display_smt2_monomial(out, m(), p[i], i, var_name);
+                display_smt2_monomial(out, cm.m(), p[i], i, var_name, allow_power);
             }
         }
         return out << ")";
     }

+    // Display p as an s-expression
+    std::ostream& core_manager::display_smt2(std::ostream & out, unsigned sz, numeral const * p, char const * var_name) const {
+        return display_smt2_core(out, *this, sz, p, var_name, true);
+    }
+
+    std::ostream& core_manager::display_smt2_no_power(std::ostream & out, unsigned sz, numeral const * p, char const * var_name) const {
+        return display_smt2_core(out, *this, sz, p, var_name, false);
+    }
+
     bool core_manager::eq(unsigned sz1, numeral const * p1, unsigned sz2, numeral const * p2) {
         if (sz1 != sz2)
             return false;
@@ -3117,4 +3139,3 @@ namespace upolynomial {
         return out;
     }
 };

@@ -468,6 +468,7 @@ namespace upolynomial {
         std::ostream& display_smt2(std::ostream & out, numeral_vector const & p, char const * var_name = "x") const {
             return display_smt2(out, p.size(), p.data(), var_name);
         }
+        std::ostream& display_smt2_no_power(std::ostream & out, unsigned sz, numeral const * p, char const * var_name = "x") const;
     };

     class scoped_set_z {
@@ -917,4 +918,3 @@ namespace upolynomial {
     };
 };

@@ -1021,7 +1021,7 @@ namespace realclosure {
     }

     static bool is_rational_function(numeral const & a) {
-        return is_rational_function(a.m_value);
+        return !is_zero(a) && is_rational_function(a.m_value);
     }

     static rational_function_value * to_rational_function(numeral const & a) {
@@ -2521,7 +2521,7 @@ namespace realclosure {
           \brief Return true if a is a rational.
        */
        bool is_rational(numeral const & a) {
-           return a.m_value->is_rational();
+           return is_zero(a) || a.m_value->is_rational();
        }
@@ -3429,7 +3429,7 @@ namespace realclosure {
            }
        }

-       bool get_interval(numeral const & a, int & lower_is_inf, int & lower_is_open, numeral & lower, int & upper_is_inf, int & upper_is_open, numeral & upper)
+       bool get_interval(numeral const & a, bool & lower_is_inf, bool & lower_is_open, numeral & lower, bool & upper_is_inf, bool & upper_is_open, numeral & upper)
        {
            if (!is_algebraic(a))
                return false;
@@ -6475,7 +6475,7 @@ namespace realclosure {
        return m_imp->get_sign_condition_sign(a, i);
    }

-   bool manager::get_interval(numeral const & a, int & lower_is_inf, int & lower_is_open, numeral & lower, int & upper_is_inf, int & upper_is_open, numeral & upper)
+   bool manager::get_interval(numeral const & a, bool & lower_is_inf, bool & lower_is_open, numeral & lower, bool & upper_is_inf, bool & upper_is_open, numeral & upper)
    {
        return m_imp->get_interval(a, lower_is_inf, lower_is_open, lower, upper_is_inf, upper_is_open, upper);
    }

@@ -298,7 +298,7 @@ namespace realclosure {
        int get_sign_condition_sign(numeral const &a, unsigned i);
-       bool get_interval(numeral const & a, int & lower_is_inf, int & lower_is_open, numeral & lower, int & upper_is_inf, int & upper_is_open, numeral & upper);
+       bool get_interval(numeral const & a, bool & lower_is_inf, bool & lower_is_open, numeral & lower, bool & upper_is_inf, bool & upper_is_open, numeral & upper);
        unsigned num_sign_condition_coefficients(numeral const &a, unsigned i);

(File diff suppressed because it is too large.)
@@ -45,7 +45,7 @@ namespace nlsat {
        void set_minimize_cores(bool f);
        void set_factor(bool f);
        void set_add_all_coeffs(bool f);
-       void set_signed_project(bool f);
+       void set_add_zero_disc(bool f);

        /**
           \brief Given a set of literals ls[0], ... ls[n-1] s.t.

@@ -9,6 +9,7 @@ def_module_params('nlsat',
    ('lazy', UINT, 0, "how lazy the solver is."),
    ('reorder', BOOL, True, "reorder variables."),
    ('log_lemmas', BOOL, False, "display lemmas as self-contained SMT formulas"),
+   ('log_lemma_smtrat', BOOL, False, "log lemmas to be readable by smtrat"),
    ('dump_mathematica', BOOL, False, "display lemmas as matematica"),
    ('check_lemmas', BOOL, False, "check lemmas on the fly using an independent nlsat solver"),
    ('simplify_conflicts', BOOL, True, "simplify conflicts using equalities before resolving them in nlsat solver."),
@@ -20,5 +21,7 @@ def_module_params('nlsat',
    ('seed', UINT, 0, "random seed."),
    ('factor', BOOL, True, "factor polynomials produced during conflict resolution."),
    ('add_all_coeffs', BOOL, False, "add all polynomial coefficients during projection."),
+   ('zero_disc', BOOL, False, "add_zero_assumption to the vanishing discriminant."),
    ('known_sat_assignment_file_name', STRING, "", "the file name of a known solution: used for debugging only")
))

@@ -219,9 +219,11 @@ namespace nlsat {
        unsigned               m_random_seed;
        bool                   m_inline_vars;
        bool                   m_log_lemmas;
+       bool                   m_log_lemma_smtrat;
        bool                   m_dump_mathematica;
        bool                   m_check_lemmas;
        unsigned               m_max_conflicts;
+       unsigned               m_lemma_rlimit;
        unsigned               m_lemma_count;
        unsigned               m_variable_ordering_strategy;
        bool                   m_set_0_more;
@@ -269,6 +271,7 @@ namespace nlsat {
            reset_statistics();
            mk_true_bvar();
            m_lemma_count = 0;
+           m_lemma_rlimit = 100 * 1000; // one hundred seconds
        }

        ~imp() {
@@ -295,6 +298,7 @@ namespace nlsat {
            m_random_seed      = p.seed();
            m_inline_vars      = p.inline_vars();
            m_log_lemmas       = p.log_lemmas();
+           m_log_lemma_smtrat = p.log_lemma_smtrat();
            m_dump_mathematica = p.dump_mathematica();
            m_check_lemmas     = p.check_lemmas();
            m_variable_ordering_strategy = p.variable_ordering_strategy();
@@ -307,6 +311,7 @@ namespace nlsat {
            m_explain.set_minimize_cores(min_cores);
            m_explain.set_factor(p.factor());
            m_explain.set_add_all_coeffs(p.add_all_coeffs());
+           m_explain.set_add_zero_disc(p.zero_disc());
            m_am.updt_params(p.p);
        }
@@ -750,6 +755,14 @@ namespace nlsat {
            m_atoms[b] = new_atom;
            new_atom->m_bool_var = b;
            m_pm.inc_ref(new_atom->p());
+           TRACE(nlsat_solver,
+                 tout << "created root literal b" << b << ": ";
+                 display(tout, literal(b, false)) << "\n";
+                 tout << "  kind: " << k << ", index: " << i << ", variable: x" << x << "\n";
+                 tout << "  polynomial: ";
+                 display_polynomial(tout, new_atom->p(), m_display_var);
+                 tout << "\n";
+                 );
            return b;
        }
@@ -971,8 +984,7 @@ namespace nlsat {
            lbool val = l_undef;
            // Arithmetic atom: evaluate directly
-           var max = a->max_var();
-           SASSERT(debug_assignment.is_assigned(max));
+           SASSERT(debug_assignment.is_assigned(a->max_var()));
            val = to_lbool(debug_evaluator.eval(a, l.sign()));
            SASSERT(val != l_undef);
            if (val == l_true)
@@ -1110,25 +1122,39 @@ namespace nlsat {
            }
        }

-       void log_lemma(std::ostream& out, clause const& cls) {
-           log_lemma(out, cls.size(), cls.data(), false);
+       void log_lemma(std::ostream& out, clause const& cls, std::string annotation) {
+           log_lemma(out, cls.size(), cls.data(), true, annotation);
        }

-       void log_lemma(std::ostream& out, unsigned n, literal const* cls, bool is_valid) {
-           ++m_lemma_count;
-           out << "(set-logic ALL)\n";
-           if (is_valid) {
-               display_smt2_bool_decls(out);
-               display_smt2_arith_decls(out);
+       void log_lemma(std::ostream& out, unsigned n, literal const* cls, bool is_valid, std::string annotation) {
+           bool_vector used_vars(num_vars(), false);
+           bool_vector used_bools(usize(m_atoms), false);
+           var_vector vars;
+           for (unsigned j = 0; j < n; j++) {
+               literal lit = cls[j];
+               bool_var b = lit.var();
+               if (b != null_bool_var && b < used_bools.size())
+                   used_bools[b] = true;
+               vars.reset();
+               this->vars(lit, vars);
+               for (var v : vars)
+                   used_vars[v] = true;
            }
-           else
-               display_smt2(out);
+           display(out << "(echo \"#" << m_lemma_count++ << ":" << annotation << ":", n, cls) << "\")\n";
+           if (m_log_lemma_smtrat)
+               out << "(set-logic NRA)\n";
+           else
+               out << "(set-logic ALL)\n";
+           out << "(set-option :rlimit " << m_lemma_rlimit << ")\n";
+           if (is_valid) {
+               display_smt2_bool_decls(out, used_bools);
+               display_smt2_arith_decls(out, used_vars);
+           }
            for (unsigned i = 0; i < n; ++i)
                display_smt2(out << "(assert ", ~cls[i]) << ")\n";
-           display(out << "(echo \"#" << m_lemma_count << " ", n, cls) << "\")\n";
            out << "(check-sat)\n(reset)\n";
-           TRACE(nlsat, display(tout << "(echo \"#" << m_lemma_count << " ", n, cls) << "\")\n");
        }

        clause * mk_clause_core(unsigned num_lits, literal const * lits, bool learned, _assumption_set a) {
@@ -1152,12 +1178,6 @@ namespace nlsat {
            TRACE(nlsat_sort, display(tout << "mk_clause:\n", *cls) << "\n";);
            std::sort(cls->begin(), cls->end(), lit_lt(*this));
            TRACE(nlsat, display(tout << " after sort:\n", *cls) << "\n";);
-           if (learned && m_log_lemmas) {
-               log_lemma(verbose_stream(), *cls);
-           }
-           if (learned && m_check_lemmas) {
-               check_lemma(cls->size(), cls->data(), false, cls->assumptions());
-           }
            if (learned)
                m_learned.push_back(cls);
            else
@@ -1553,7 +1573,7 @@ namespace nlsat {
            unsigned first_undef = UINT_MAX;         // position of the first undefined literal
            interval_set_ref first_undef_set(m_ism); // infeasible region of the first undefined literal
            interval_set * xk_set = m_infeasible[m_xk]; // current set of infeasible interval for current variable
-           TRACE(nlsat_inf_set, tout << "m_infeasible["<< debug_get_var_name(m_xk) << "]:";
+           TRACE(nlsat_inf_set, tout << "m_infeasible[x"<< m_xk << "]:";
                  m_ism.display(tout, xk_set) << "\n";);
            SASSERT(!m_ism.is_full(xk_set));
            for (unsigned idx = 0; idx < cls.size(); ++idx) {
@@ -1573,7 +1593,7 @@ namespace nlsat {
                    SASSERT(a != nullptr);
                    interval_set_ref curr_set(m_ism);
                    curr_set = m_evaluator.infeasible_intervals(a, l.sign(), &cls);
                    TRACE(nlsat_inf_set,
                          tout << "infeasible set for literal: "; display(tout, l); tout << "\n"; m_ism.display(tout, curr_set); tout << "\n";
                          display(tout << "cls: " , cls) << "\n";
                          tout << "m_xk:" << m_xk << "(" << debug_get_var_name(m_xk) << ")"<< "\n";);
@@ -1599,7 +1619,16 @@ namespace nlsat {
                        TRACE(nlsat_inf_set, tout << "infeasible set + current set = R, skip literal\n";
                              display(tout, cls) << "\n";
                              display_assignment_for_clause(tout, cls);
-                             m_ism.display(tout, tmp); tout << "\n";
+                             m_ism.display(tout, tmp) << "\n";
+                             literal_vector inf_lits;
+                             ptr_vector<clause> inf_clauses;
+                             m_ism.get_justifications(tmp, inf_lits, inf_clauses);
+                             if (!inf_lits.empty()) {
+                                 tout << "Interval witnesses:\n";
+                                 for (literal inf_lit : inf_lits) {
+                                     display(tout << "  ", inf_lit) << "\n";
+                                 }
+                             }
                              );
                        R_propagate(~l, tmp, false);
                        continue;
@@ -1869,6 +1898,14 @@ namespace nlsat {
                       << " :learned " << m_learned.size() << ")\n");
        }

+       void try_reorder() {
+           gc();
+           if (m_stats.m_restarts % 10)
+               return;
+           if (m_reordered)
+               restore_order();
+           apply_reorder();
+       }

        lbool search_check() {
            lbool r = l_undef;
@@ -1880,6 +1917,9 @@ namespace nlsat {
                if (r != l_true)
                    break;
                ++m_stats.m_restarts;
+               try_reorder();
+
                vector<std::pair<var, rational>> bounds;

                for (var x = 0; x < num_vars(); x++) {
@@ -1905,13 +1945,6 @@ namespace nlsat {
                if (bounds.empty())
                    break;
-               gc();
-               if (m_stats.m_restarts % 10 == 0) {
-                   if (m_reordered)
-                       restore_order();
-                   apply_reorder();
-               }
                init_search();
                IF_VERBOSE(2, verbose_stream() << "(nlsat-b&b :conflicts " << m_stats.m_conflicts
                           << " :decisions " << m_stats.m_decisions
@@ -2197,45 +2230,107 @@ namespace nlsat {
            display_mathematica_lemma(out, core.size(), core.data(), true);
            return out;
        }

+       void log_assignment_lemma_smt2(std::ostream& out, lazy_justification const & jst) {
+           // This lemma is written down only for debug purposes, it does not participate in the algorithm.
+           // We need to be sure that lazy certification is sound on the sample.
+           // In this lemma we do not use literals created by projection.
+           literal_vector core;
+           bool_vector used_vars(num_vars(), false);
+           bool_vector used_bools(usize(m_atoms), false);
+           var_vector vars;
+           for (unsigned i = 0; i < jst.num_lits(); ++i) {
+               literal lit = ~jst.lit(i);
+               core.push_back(lit);
+               bool_var b = lit.var();
+               if (b != null_bool_var && b < used_bools.size())
+                   used_bools[b] = true;
+               vars.reset();
+               this->vars(lit, vars);
+               for (var v : vars)
+                   used_vars[v] = true;
+           }
+           std::ostringstream comment;
+           bool any_var = false;
+           display_num_assignment(comment, &used_vars);
+           if (!any_var)
+               comment << " (none)";
+           comment << "; literals:";
+           if (jst.num_lits() == 0) {
+               comment << " (none)";
+           }
+           else {
+               for (unsigned i = 0; i < jst.num_lits(); ++i) {
+                   comment << " ";
+                   display(comment, jst.lit(i));
+                   if (i < jst.num_lits() - 1)
+                       comment << " /\\";
+               }
+           }
+           out << "(echo \"#" << m_lemma_count++ << ":assignment lemma " << comment.str() << "\")\n";
+           if (m_log_lemma_smtrat)
+               out << "(set-logic NRA)\n";
+           else
+               out << "(set-logic ALL)\n";
+           out << "(set-option :rlimit " << m_lemma_rlimit << ")\n";
+           display_smt2_bool_decls(out, used_bools);
+           display_smt2_arith_decls(out, used_vars);
+           display_bool_assignment(out, false, &used_bools);
+           display_num_assignment(out, &used_vars);
+           for (literal lit : core) {
+               literal asserted = ~lit;
+               bool is_root = asserted.var() != null_bool_var &&
+                              m_atoms[asserted.var()] != nullptr &&
+                              m_atoms[asserted.var()]->is_root_atom();
+               if (is_root) {
+                   display_root_literal_block(out, asserted, m_display_var);
+               }
+               else {
+                   out << "(assert ";
+                   display_smt2(out, asserted);
+                   out << ")\n";
+               }
+           }
+           out << "(check-sat)\n";
+           out << "(reset)\n";
+       }
        void resolve_lazy_justification(bool_var b, lazy_justification const & jst) {
+           // ++ttt;
            TRACE(nlsat_resolve, tout << "resolving lazy_justification for b" << b << "\n";);
            unsigned sz = jst.num_lits();

            // Dump lemma as Mathematica formula that must be true,
-           // if the current interpretation (really) makes the core in jst infeasible.
-           TRACE(nlsat_mathematica, tout << "assignment lemma\n"; print_out_as_math(tout, jst););
-           if (m_dump_mathematica) {
-               // verbose_stream() << "assignment lemma in matematica\n";
+           // if the current interpretation, the sample, makes the core in jst infeasible.
+           TRACE(nlsat_mathematica,
+                 tout << "assignment lemma\n"; print_out_as_math(tout, jst) << "\n:assignment lemmas as smt2\n";
+                 log_assignment_lemma_smt2(tout, jst););
+           if (m_dump_mathematica)
                print_out_as_math(verbose_stream(), jst) << std::endl;
-               // verbose_stream() << "\nend of assignment lemma\n";
-           }
            m_lazy_clause.reset();
            m_explain.main_operator(jst.num_lits(), jst.lits(), m_lazy_clause);
            for (unsigned i = 0; i < sz; i++)
                m_lazy_clause.push_back(~jst.lit(i));

            // lazy clause is a valid clause
-           TRACE(nlsat_mathematica, display_mathematica_lemma(tout, m_lazy_clause.size(), m_lazy_clause.data()););
-           if (m_dump_mathematica) {
-               // verbose_stream() << "lazy clause\n";
-               display_mathematica_lemma(verbose_stream(), m_lazy_clause.size(), m_lazy_clause.data()) << std::endl;
-               // verbose_stream() << "\nend of lazy\n";
-           }
+           TRACE(nlsat_mathematica, tout << "ttt:" << m_lemma_count << "\n"; display_mathematica_lemma(tout, m_lazy_clause.size(), m_lazy_clause.data()););
+           if (m_dump_mathematica)
+               display_mathematica_lemma(std::cout, m_lazy_clause.size(), m_lazy_clause.data()) << std::endl;
            TRACE(nlsat_proof_sk, tout << "theory lemma\n"; display_abst(tout, m_lazy_clause.size(), m_lazy_clause.data()); tout << "\n";);
            TRACE(nlsat_resolve,
                  tout << "m_xk: " << m_xk << ", "; m_display_var(tout, m_xk) << "\n";
                  tout << "new valid clause:\n";
                  display(tout, m_lazy_clause.size(), m_lazy_clause.data()) << "\n";);

-           if (m_log_lemmas)
-               log_lemma(verbose_stream(), m_lazy_clause.size(), m_lazy_clause.data(), true);
+           if (m_log_lemmas) {
+               log_assignment_lemma_smt2(std::cout, jst);
+               log_lemma(verbose_stream(), m_lazy_clause.size(), m_lazy_clause.data(), true, "conflict");
+           }

            if (m_check_lemmas) {
                check_lemma(m_lazy_clause.size(), m_lazy_clause.data(), false, nullptr);
@@ -2486,8 +2581,8 @@ namespace nlsat {
                check_lemma(m_lemma.size(), m_lemma.data(), false, m_lemma_assumptions.get());
            }

-           if (m_log_lemmas)
-               log_lemma(verbose_stream(), m_lemma.size(), m_lemma.data(), false);
+           // if (m_log_lemmas)
+           //     log_lemma(std::cout, m_lemma.size(), m_lemma.data(), false);

            // There are two possibilities:
            // 1) m_lemma contains only literals from previous stages, and they
@@ -2808,7 +2903,7 @@ namespace nlsat {
            // verbose_stream() << "\npermutation: " << p[0] << " count " << count << " " << m_rlimit.is_canceled() << "\n";
            reinit_cache();
            SASSERT(num_vars() == sz);
-           TRACE(nlsat_bool_assignment_bug, tout << "before reset watches\n"; display_bool_assignment(tout););
+           TRACE(nlsat_bool_assignment_bug, tout << "before reset watches\n"; display_bool_assignment(tout, false, nullptr););
            reset_watches();
            assignment new_assignment(m_am);
            for (var x = 0; x < num_vars(); x++) {
@@ -2850,7 +2945,7 @@ namespace nlsat {
            m_pm.rename(sz, p);
            for (auto& b : m_bounds)
                b.x = p[b.x];
-           TRACE(nlsat_bool_assignment_bug, tout << "before reinit cache\n"; display_bool_assignment(tout););
+           TRACE(nlsat_bool_assignment_bug, tout << "before reinit cache\n"; display_bool_assignment(tout, false, nullptr););
            reinit_cache();
            m_assignment.swap(new_assignment);
            reattach_arith_clauses(m_clauses);
@@ -3261,9 +3356,34 @@ namespace nlsat {
        //
        // -----------------------

-       std::ostream& display_num_assignment(std::ostream & out, display_var_proc const & proc) const {
+       std::ostream& display_num_assignment(std::ostream & out, display_var_proc const & proc, bool_vector const* used_vars = nullptr) const {
+           bool restrict = used_vars != nullptr;
            for (var x = 0; x < num_vars(); x++) {
-               if (m_assignment.is_assigned(x)) {
+               if (restrict && (x >= used_vars->size() || !(*used_vars)[x]))
+                   continue;
+               if (!m_assignment.is_assigned(x))
+                   continue;
+               if (restrict) {
+                   out << "(assert (= ";
+                   proc(out, x);
+                   out << " ";
+                   if (m_am.is_rational(m_assignment.value(x))) {
+                       mpq q;
+                       m_am.to_rational(m_assignment.value(x), q);
+                       m_am.qm().display_smt2(out, q, false);
+                   }
+                   else if (m_log_lemma_smtrat) {
+                       std::ostringstream var_name;
+                       proc(var_name, x);
+                       std::string name = var_name.str();
+                       m_am.display_root_smtrat(out, m_assignment.value(x), name.c_str());
+                   }
+                   else {
+                       m_am.display_root_smt2(out, m_assignment.value(x));
+                   }
+                   out << "))\n";
+               }
+               else {
                    proc(out, x);
                    out << " -> ";
                    m_am.display_decimal(out, m_assignment.value(x));
@@ -3273,8 +3393,21 @@ namespace nlsat {
            return out;
        }

-       std::ostream& display_bool_assignment(std::ostream & out, bool eval_atoms = false) const {
+       std::ostream& display_bool_assignment(std::ostream & out, bool eval_atoms = false, bool_vector const* used = nullptr) const {
            unsigned sz = usize(m_atoms);
+           if (used != nullptr) {
+               for (bool_var b = 0; b < sz; b++) {
+                   if (b >= used->size() || !(*used)[b])
+                       continue;
+                   if (m_atoms[b] != nullptr)
+                       continue;
+                   lbool val = m_bvalues[b];
+                   if (val == l_undef)
+                       continue;
+                   out << "(assert (= b" << b << " " << (val == l_true ? "true" : "false") << "))\n";
+               }
+               return out;
+           }
            if (!eval_atoms) {
                for (bool_var b = 0; b < sz; b++) {
                    if (m_bvalues[b] == l_undef)
@@ -3319,13 +3452,13 @@ namespace nlsat {
            return !first;
        }

-       std::ostream& display_num_assignment(std::ostream & out) const {
-           return display_num_assignment(out, m_display_var);
+       std::ostream& display_num_assignment(std::ostream & out, const bool_vector* used_vars = nullptr) const {
+           return display_num_assignment(out, m_display_var, used_vars);
        }

        std::ostream& display_assignment(std::ostream& out, bool eval_atoms = false) const {
-           display_bool_assignment(out, eval_atoms);
-           display_num_assignment(out);
+           display_bool_assignment(out, eval_atoms, nullptr);
+           display_num_assignment(out, nullptr);
            return out;
        }
@ -3529,44 +3662,93 @@ namespace nlsat {
} }
std::ostream& display_root_smt2(std::ostream& out, root_atom const& a, display_var_proc const& proc) const { std::ostream& display_root_term_smtrat(std::ostream& out, root_atom const& a, display_var_proc const& proc) const {
if (a.i() == 1 && m_pm.degree(a.p(), a.x()) == 1) out << "(root ";
return display_linear_root_smt2(out, a, proc); display_polynomial_smt2(out, a.p(), proc);
#if 1 out << " " << a.i() << " ";
proc(out, a.x());
out << ")";
return out;
}
std::ostream& display_root_atom_smtrat(std::ostream& out, root_atom const& a, display_var_proc const& proc) const {
char const* rel = "=";
switch (a.get_kind()) {
case atom::ROOT_LT: rel = "<"; break;
case atom::ROOT_GT: rel = ">"; break;
case atom::ROOT_LE: rel = "<="; break;
case atom::ROOT_GE: rel = ">="; break;
case atom::ROOT_EQ: rel = "="; break;
default: UNREACHABLE(); break;
}
out << "(" << rel << " ";
proc(out, a.x());
out << " ";
display_root_term_smtrat(out, a, proc);
out << ")";
return out;
}
struct root_poly_subst : public display_var_proc {
display_var_proc const& m_proc;
var m_var;
char const* m_name;
root_poly_subst(display_var_proc const& p, var v, char const* name):
m_proc(p), m_var(v), m_name(name) {}
std::ostream& operator()(std::ostream& dst, var x) const override {
if (x == m_var)
return dst << m_name;
return m_proc(dst, x);
}
};
template<typename Printer>
std::ostream& display_root_quantified(std::ostream& out, root_atom const& a, display_var_proc const& proc, Printer const& printer) const {
// if (a.i() == 1 && m_pm.degree(a.p(), a.x()) == 1)
// return display_linear_root_smt2(out, a, proc);
auto mk_y_name = [](unsigned j) {
return std::string("y") + std::to_string(j);
};
unsigned idx = a.i();
SASSERT(idx > 0);
out << "(exists (";
for (unsigned j = 0; j < idx; ++j) {
auto y = mk_y_name(j);
out << "(" << y << " Real) ";
}
out << ")\n (and\n";
for (unsigned j = 0; j < idx; ++j) {
auto y = mk_y_name(j);
out << " (= ";
printer(out, y.c_str());
out << " 0)\n";
}
for (unsigned j = 0; j + 1 < idx; ++j) {
auto y1 = mk_y_name(j);
auto y2 = mk_y_name(j + 1);
out << " (< " << y1 << " " << y2 << ")\n";
}
auto y0 = mk_y_name(0);
out << " (forall ((y Real)) (=> (< y " << y0 << ") (not (= ";
printer(out, "y");
out << " 0))))\n";
for (unsigned j = 0; j + 1 < idx; ++j) {
auto y1 = mk_y_name(j);
auto y2 = mk_y_name(j + 1);
out << " (forall ((y Real)) (=> (and (< " << y1 << " y) (< y " << y2 << ")) (not (= ";
printer(out, "y");
out << " 0))))\n";
}
std::string yn = mk_y_name(idx - 1);
out << " ";
switch (a.get_kind()) {
case atom::ROOT_LT: out << "(< "; proc(out, a.x()); out << " " << yn << ")"; break;
case atom::ROOT_GT: out << "(> "; proc(out, a.x()); out << " " << yn << ")"; break;
@@ -3575,12 +3757,33 @@ namespace nlsat {
case atom::ROOT_EQ: out << "(= "; proc(out, a.x()); out << " " << yn << ")"; break;
default: UNREACHABLE(); break;
}
out << "\n )\n)";
return out;
}
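The encoding above states that y0 < y1 < … are the first `idx` real roots of the polynomial: each y_j is a root, no root lies below y0, and no root lies strictly between consecutive y_j. The scheme can be exercised in isolation; the sketch below uses invented names (`encode_root_lt`, a string-returning polynomial callback in place of the `printer` argument) and is not the z3 source:

```cpp
#include <cassert>
#include <functional>
#include <sstream>
#include <string>

// Hypothetical sketch: emit the quantified SMT2 encoding of
// "x < idx-th real root of p". `p` returns the polynomial text with the
// given variable name substituted, mirroring the `printer` callback above.
std::string encode_root_lt(const std::string& x,
                           const std::function<std::string(const std::string&)>& p,
                           unsigned idx) {
    auto y = [](unsigned j) { return "y" + std::to_string(j); };
    std::ostringstream out;
    out << "(exists (";
    for (unsigned j = 0; j < idx; ++j)
        out << "(" << y(j) << " Real) ";
    out << ")\n (and\n";
    for (unsigned j = 0; j < idx; ++j)      // every y_j is a root of p
        out << "  (= " << p(y(j)) << " 0)\n";
    for (unsigned j = 0; j + 1 < idx; ++j)  // roots listed in increasing order
        out << "  (< " << y(j) << " " << y(j + 1) << ")\n";
    // no root below y_0, so y_0 is the first root ...
    out << "  (forall ((y Real)) (=> (< y " << y(0)
        << ") (not (= " << p("y") << " 0))))\n";
    // ... and no root between consecutive y_j, so y_{idx-1} is the idx-th
    for (unsigned j = 0; j + 1 < idx; ++j)
        out << "  (forall ((y Real)) (=> (and (< " << y(j) << " y) (< y "
            << y(j + 1) << ")) (not (= " << p("y") << " 0))))\n";
    out << "  (< " << x << " " << y(idx - 1) << ")))";
    return out.str();
}
```

For `p = y^2 - 2` and `idx = 2` this produces an `exists` over `y0 y1` whose body forces `y0 = -sqrt(2)` and `y1 = sqrt(2)`, then asserts `(< x y1)`.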
std::ostream& display_root_smt2(std::ostream& out, root_atom const& a, display_var_proc const& proc) const {
if (m_log_lemma_smtrat)
return display_root_atom_smtrat(out, a, proc);
auto inline_printer = [&](std::ostream& dst, char const* y) -> std::ostream& {
root_poly_subst poly_proc(proc, a.x(), y);
return display_polynomial_smt2(dst, a.p(), poly_proc);
};
return display_root_quantified(out, a, proc, inline_printer);
}
std::ostream& display_root_literal_block(std::ostream& out, literal lit, display_var_proc const& proc) const {
bool_var b = lit.var();
SASSERT(m_atoms[b] != nullptr && m_atoms[b]->is_root_atom());
auto const& a = *to_root_atom(m_atoms[b]);
out << "(assert ";
if (lit.sign())
out << "(not ";
display_root_smt2(out, a, proc);
if (lit.sign())
out << ")";
out << ")\n";
return out;
}
std::ostream& display_root(std::ostream & out, root_atom const & a, display_var_proc const & proc) const {
@@ -3998,31 +4201,51 @@ namespace nlsat {
return m_display_var(out, j);
}
std::ostream& display_smt2_arith_decls(std::ostream & out, bool_vector& used_vars) const {
unsigned sz = m_is_int.size();
for (unsigned i = 0; i < sz; i++) {
if (!used_vars[i]) continue;
out << "(declare-fun ";
m_display_var(out, i);
out << " () ";
if (!m_log_lemma_smtrat && is_int(i)) {
out << "Int";
}
else {
out << "Real";
}
out << ")\n";
}
return out;
}
std::ostream& display_smt2_bool_decls(std::ostream & out, const bool_vector& used_bools) const {
unsigned sz = usize(m_atoms);
for (unsigned i = 0; i < sz; i++) {
if (m_atoms[i] == nullptr && used_bools[i])
out << "(declare-fun b" << i << " () Bool)\n";
}
return out;
}
std::ostream& display_smt2(std::ostream & out) const {
bool_vector used_vars(num_vars(), false);
bool_vector used_bools(usize(m_atoms), false);
var_vector vars;
for (clause* c: m_clauses) {
for (literal lit : *c) {
bool_var b = lit.var();
if (b != null_bool_var && b < used_bools.size())
used_bools[b] = true;
vars.reset();
this->vars(lit, vars);
for (var v : vars)
used_vars[v] = true;
}
}
display_smt2_bool_decls(out, used_bools);
display_smt2_arith_decls(out, used_vars);
out << "(assert (and true\n";
for (clause* c : m_clauses) {
display_smt2(out, *c, m_display_var) << "\n";
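The declaration pruning in `display_smt2` follows a two-pass pattern: first sweep all clauses to mark which Boolean and arithmetic variables actually occur, then emit `declare-fun` lines only for marked entries. A minimal standalone sketch of that pattern, with toy types standing in for the nlsat structures (not the z3 code):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Toy stand-ins: each literal is just the list of arithmetic variables
// it mentions; a clause is a list of literals.
using literal_vars = std::vector<unsigned>;
using toy_clause = std::vector<literal_vars>;

// Pass 1: mark variables occurring in some clause.
std::vector<bool> collect_used_vars(const std::vector<toy_clause>& clauses,
                                    unsigned num_vars) {
    std::vector<bool> used(num_vars, false);
    for (auto const& c : clauses)
        for (auto const& lit : c)
            for (unsigned v : lit)
                if (v < num_vars)
                    used[v] = true;
    return used;
}

// Pass 2: declare only the marked variables.
std::string declare_used(const std::vector<bool>& used) {
    std::string out;
    for (unsigned v = 0; v < used.size(); ++v)
        if (used[v])
            out += "(declare-fun x" + std::to_string(v) + " () Real)\n";
    return out;
}
```

Skipping unused declarations keeps the logged lemmas self-contained and small, which matters when they are replayed by an external checker.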


@@ -589,22 +589,6 @@ public:
--m_correction_set_size;
}
trace();
bool no_hidden_soft = (m_st == s_primal_dual || m_st == s_primal || m_st == s_primal_binary);
if (no_hidden_soft && m_c.num_objectives() == 1 && m_pivot_on_cs && m_csmodel.get() && m_correction_set_size < core.size()) {
exprs cs;
get_current_correction_set(m_csmodel.get(), cs);
m_correction_set_size = cs.size();
TRACE(opt, tout << "cs " << m_correction_set_size << " " << core.size() << "\n";);
if (m_correction_set_size >= core.size())
return;
rational w(0);
for (expr* a : m_asms) {
rational w1 = m_asm2weight[a];
if (w != 0 && w1 != w) return;
w = w1;
}
process_sat(cs);
}
}
bool get_mus_model(model_ref& mdl) {


@@ -20,6 +20,7 @@ Notes:
#include "util/gparams.h"
#include "ast/for_each_expr.h"
#include "ast/ast_pp.h"
#include "ast/ast_translation.h"
#include "ast/bv_decl_plugin.h"
#include "ast/pb_decl_plugin.h"
#include "ast/ast_smt_pp.h"
@@ -155,6 +156,57 @@ namespace opt {
reset_maxsmts();
}
context* context::translate(ast_manager& target_m) {
// Create AST translator
ast_translation translator(m, target_m);
// Create new context in target manager
context* result = alloc(context, target_m);
// Copy parameters
result->updt_params(m_params);
// Set logic
if (m_logic != symbol::null) {
result->set_logic(m_logic);
}
// Translate hard constraints from scoped state
for (expr* e : m_scoped_state.m_hard) {
result->add_hard_constraint(translator(e));
}
// Translate objectives
for (auto const& obj : m_scoped_state.m_objectives) {
if (obj.m_type == O_MAXIMIZE || obj.m_type == O_MINIMIZE) {
// Translate maximize/minimize objectives
app_ref translated_term(to_app(translator(obj.m_term.get())), target_m);
result->add_objective(translated_term, obj.m_type == O_MAXIMIZE);
}
else if (obj.m_type == O_MAXSMT) {
// Translate soft constraints for MaxSMT objectives
for (unsigned i = 0; i < obj.m_terms.size(); ++i) {
result->add_soft_constraint(
translator(obj.m_terms.get(i)),
obj.m_weights[i],
obj.m_id
);
}
}
}
// Copy configuration flags
result->m_enable_sat = m_enable_sat;
result->m_enable_sls = m_enable_sls;
result->m_is_clausal = m_is_clausal;
result->m_pp_neat = m_pp_neat;
result->m_pp_wcnf = m_pp_wcnf;
result->m_incremental = m_incremental;
result->m_maxsat_engine = m_maxsat_engine;
return result;
}
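`context::translate` follows the clone-into-a-fresh-manager pattern used elsewhere in Z3 (for example `solver::translate`): allocate a sibling object against the target manager, then push every piece of state through an `ast_translation`. A toy illustration of the pattern with invented stand-in types (no Z3 dependencies):

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Invented stand-ins: a "manager" owning interned terms and a "context"
// holding constraints as handles into its manager. Not the Z3 classes,
// just the translation pattern.
struct manager {
    std::vector<std::string> pool;
    unsigned intern(const std::string& s) {
        pool.push_back(s);
        return (unsigned)pool.size() - 1;
    }
};

struct translator {
    manager& src;
    manager& dst;
    // re-create a src handle inside dst (analogue of ast_translation)
    unsigned operator()(unsigned h) { return dst.intern(src.pool[h]); }
};

struct context {
    manager& m;
    std::vector<unsigned> hard;
    explicit context(manager& mgr) : m(mgr) {}
    std::unique_ptr<context> translate(manager& target) {
        translator tr{m, target};
        auto result = std::make_unique<context>(target);
        for (unsigned h : hard)        // push all state through the translator
            result->hard.push_back(tr(h));
        return result;
    }
};
```

The key invariant, in the toy as in Z3, is that the clone holds no handles into the source manager, so the two contexts can afterwards be used from different threads.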
void context::reset_maxsmts() {
for (auto& kv : m_maxsmts) {
dealloc(kv.m_value);
@@ -406,6 +458,7 @@ namespace opt {
void context::set_model(model_ref& m) {
m_model = m;
m_model_available = true;
opt_params optp(m_params);
symbol prefix = optp.solution_prefix();
bool model2console = optp.dump_models();
@@ -438,6 +491,8 @@ namespace opt {
void context::get_model_core(model_ref& mdl) {
if (!m_model_available)
throw default_exception("model is not available");
mdl = m_model;
CTRACE(opt, mdl, tout << *mdl;);
fix_model(mdl);
@@ -1678,6 +1733,7 @@ namespace opt {
m_model.reset();
m_model_fixed.reset();
m_core.reset();
m_model_available = false;
}
void context::set_pareto(pareto_base* p) {


@@ -186,7 +186,8 @@ namespace opt {
map_t m_maxsmts;
scoped_state m_scoped_state;
vector<objective> m_objectives;
model_ref m_model;
bool m_model_available = false;
model_converter_ref m_model_converter;
generic_model_converter_ref m_fm;
sref_vector<model> m_model_fixed;
@@ -209,6 +210,13 @@ namespace opt {
public:
context(ast_manager& m);
~context() override;
/**
* \brief Create a clone of the optimization context in a different ast_manager.
* Translates all assertions, objectives, and solver state.
*/
context* translate(ast_manager& target_m);
unsigned add_soft_constraint(expr* f, rational const& w, symbol const& id);
unsigned add_objective(app* t, bool is_max);
void add_hard_constraint(expr* f);


@@ -75,15 +75,6 @@ def_module_params('sat',
('anf', BOOL, False, 'enable ANF based simplification in-processing'),
('anf.delay', UINT, 2, 'delay ANF simplification by in-processing round'),
('anf.exlin', BOOL, False, 'enable extended linear simplification'),
('cut', BOOL, False, 'enable AIG based simplification in-processing'),
('cut.delay', UINT, 2, 'delay cut simplification by in-processing round'),
('cut.aig', BOOL, False, 'extract aigs (and ites) from clauses for cut simplification'),
('cut.lut', BOOL, False, 'extract luts from clauses for cut simplification'),
('cut.xor', BOOL, False, 'extract xors from clauses for cut simplification'),
('cut.npn3', BOOL, False, 'extract 3 input functions from clauses for cut simplification'),
('cut.dont_cares', BOOL, True, 'integrate dont cares with cuts'),
('cut.redundancies', BOOL, True, 'integrate redundancy checking of cuts'),
('cut.force', BOOL, False, 'force redoing cut-enumeration until a fixed-point'),
('lookahead.cube.cutoff', SYMBOL, 'depth', 'cutoff type used to create lookahead cubes: depth, freevars, psat, adaptive_freevars, adaptive_psat'),
# - depth: the maximal cutoff is fixed to the value of lookahead.cube.depth.
# So if the value is 10, at most 1024 cubes will be generated of length 10.


@@ -20,9 +20,9 @@ def_module_params('smt_parallel',
('explicit_hardness', BOOL, False, 'use explicit hardness metric for cube'),
('cubetree', BOOL, False, 'use cube tree data structure for storing cubes'),
('searchtree', BOOL, False, 'use search tree implementation (parallel2)'),
('inprocessing', BOOL, True, 'integrate in-processing as a heuristic simplification'),
('inprocessing_delay', UINT, 0, 'number of undef before invoking simplification'),
('param_tuning', BOOL, False, 'whether to tune params online during solving'),
('enable_parallel_smt', BOOL, True, 'whether to run the parallel solver (set to FALSE to test param tuning only)'),
('tunable_params', STRING, '', 'comma-separated key=value list for online param tuning, e.g. \\"smt.arith.nl.horner=false,smt.arith.nl.delay=8\\"')
))
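The `tunable_params` value is a flat comma-separated `key=value` string, so any consumer has to split it into pairs before handing them to the parameter system. A hedged sketch of such a splitter (the function name and exact error handling are invented, based only on the format described in the option's help text):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Split "k1=v1,k2=v2" into (key, value) pairs; entries without '=' are
// skipped rather than treated as errors.
std::vector<std::pair<std::string, std::string>>
parse_tunable_params(const std::string& s) {
    std::vector<std::pair<std::string, std::string>> out;
    std::istringstream in(s);
    std::string item;
    while (std::getline(in, item, ',')) {
        auto eq = item.find('=');
        if (eq == std::string::npos)
            continue; // malformed entry, ignore
        out.emplace_back(item.substr(0, eq), item.substr(eq + 1));
    }
    return out;
}
```

For the example string from the help text, `smt.arith.nl.horner=false,smt.arith.nl.delay=8`, this yields two pairs that can then be applied as individual parameter updates.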
