diff --git a/Notes-on-laziness b/Notes-on-laziness
new file mode 100644
index 0000000000000000000000000000000000000000..e4e094589e427421ffb198b042829afddc687fb7
--- /dev/null
+++ b/Notes-on-laziness
@@ -0,0 +1,140 @@
+Some notes:
+------------
+
+A fundamental assumption: 
+--------------------------
+All functions in the program must terminate when lazy invocations are treated as eager invocations.
+Otherwise, we report that there could be infinite streams.
+
+Need to Prove
+-------------
+How does this enforce the termination of all functions after the translation?
+This will also ensure that all streams in the program are finite.
+
+Claim 1:
+--------
+Since all closures are created by the functions in the program, we have a global
+invariant that whenever we force a value, the resulting value cannot be the same
+as the original value.
+Proof:
+(a) since the value is forced, it must be a closure with some arguments
+(b) if we get back an identical closure with the same arguments, then there is a recursion
+in the function that corresponds to the lazy closure, and the recursive call happens
+with the same set of arguments.
+This means the function is non-terminating when lazy invocations are treated as eager,
+contradicting the fundamental assumption.
+
+Claim 2:
+--------
+When we force a value, it may create more closures, one of which could again be forced,
+returning another closure, and so on. Let R(C, n) denote a predicate that is true if,
+after forcing closure C, then a closure resulting from C, and so on 'n' times,
+we still have a closure.
+We have the invariant that: \exists n \in nat s.t. ~R(C, n).
+
+Alternatively, we can show that every cycle in the transformed program corresponds to a cycle in the
+input program, but this may also be hard, since the cycles could be offset, as the recursive
+calls are invoked at different points in the program.
+
+
+Important regarding termination
+--------------------------------
+Despite this, the functions in the translation may not terminate on all inputs, since the global
+invariants are not enforced. So how do we make sure that all functions terminate
+after the translation?
+
+One way to handle this would be to change the model: create a new free variable for the result and
+then relate the result and the function in eval, which acts like an assumption on the input (a precondition).
+(We know that the precondition is satisfiable because it holds for the data-structures created in the program.)
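+
+For e.g., a rough sketch of this model (the names 'Closure', 'resultOf' and 'eval' are hypothetical):
+the result is passed in as a free variable and is only assumed, via the precondition, to be related to the closure.
+	abstract class Closure[T]
+	def resultOf[T](cl: Closure[T]): T = ???  // uninterpreted: relates a closure to its result
+	def eval[T](cl: Closure[T], res: T): T = {
+	  require(res == resultOf(cl))            // the assumed precondition on the free result variable
+	  res
+	}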
+
+However, we somehow need an unlimited supply of free variables of a type.
+To do that we can have a free-list of free-variables as arguments 
+(similar to the oracle strategy for non-determinism).
+For e.g.,
+	sealed abstract class FreeList
+	case class Nil() extends FreeList
+	case class Data(fl: FreeList) extends FreeList // returns the data component
+	case class Next(fl: FreeList) extends FreeList
+
+Define a function 'nextFree' : FreeList -> FreeList.
+	var count = 0
+	def nextFree(l: FreeList): FreeList = {
+	  count += 1
+	  // wrap 'l' in (count - 1) applications of Next, then select the data component
+	  Data((1 to count - 1).foldLeft(l) { (acc, _) => Next(acc) })
+	}
+
+Every time we call a function within the body, we can use nextFree(l), where l is the input free-list,
+  to get a new free-list.
+We can now add uninterpreted functions that map the free-list to a desired value,
+for instance, to an uninterpreted result of a required type, or to a fresh integer that denotes
+a fresh id.
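+
+For e.g. (a sketch; 'freshId' and 'freshRes' are hypothetical names, and their bodies are left uninterpreted):
+	def freshId(fl: FreeList): BigInt = ???  // maps a free-list to a fresh integer id
+	def freshRes[T](fl: FreeList): T = ???   // maps a free-list to an uninterpreted result of the required type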
+
+
+Important: Reasons for unsoundness
+--------------------------------------
+
+a) the assertion on the newC function that the new closure is unevaluated may result in unsoundness
+	- fix: create new identifiers at the time of creating functions. We can also try to use epsilons.
+
+b) Normal data-types can help ensure acyclicity and finiteness of streams, but that means we need to create
+   free variables at the time of creation.
+	Currently, we are creating axioms that state acyclicity.
+
+Benchmarks to think about
+--------------------------
+
+1. Lazy constraint solving
+	We can get demand-driven pointer-analysis. Can we prove its running time?
+
+2. Packrat parsers use memoization.
+	Can we prove their running time?
+
+3. ConcTrees full version?
+
+4. Binomial heaps?
+
+5. Futures: How about redex algorithms or map-reduce algorithms?
+
+Things to implement:
+----------------------
+
+a) A way to check monotonicity of preconditions
+b) Use monotonicity while checking proofs
+c) Automatically synthesize the invariant that the state is preserved
+d) We can always assume that the state is monotonic while proving the specs
+e) Extensions for futures
+f) Extensions for memoization
+g) Extending Orb for type parameters
+*h) Try to separate the instrumentation for state and value; if not, add a postcondition
+   that the value part of the state-instrumented program is the same as some function that does not use the state.
+   But how can we do that?
+
+Integration
+-----------
+
+Try to fix bugs in the CVC4 proofs and also try to integrate them with the system.
+Try to integrate the tool with Orb.
+
+Some Notes
+----------
+
+a) It is essential to make the separation between the value and state components explicit.
+	We can use something like: "foo(x, uiState())" and "foo(x, st)", where st is the actual input state.
+
+b) We need to treat Lazyargs as non-closures. That is, they don't change the state, are not considered Unevaluated,
+	and are never added to the set.
+	We can have an implicit type conversion from normal values to lazyargs,
+	since normal values are eagerly computed. This lets us specify lazyargs succinctly in the code
+	(see the sketch below). We need to change the modelling of isEvaluated and of the value function calls.
+
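+	A minimal sketch of such a conversion (to live inside an object, mirroring eagerToLazy in
+	library/lazy/package.scala; the name 'Lazyarg' is used here only for illustration):
+		case class Lazyarg[T](value: T)  // an already-evaluated argument, not a closure
+		implicit def eagerToLazyarg[T](x: T): Lazyarg[T] = Lazyarg(x)
+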
+c) Futures, when modelled as closures, can be stored and later joined.
+E.g. a function 'spawn' that will spawn tasks and a function 'join' that will join
+the spawned threads.
+However, we need to explicitly mark the functions that can be executed in parallel, from the
+point of creation of a Future to the point of joining the Future.
+E.g.:
+spawn()
+spawn()
+parallel(op(), join(), join())
+// this means op() and the two futures will execute in parallel
+
+d) Memoization: memoized closures are evaluated at the time of creation, unlike lazy closures.
+		We need to track whether a closure belongs to a state or not.
+
+
diff --git a/TODO b/TODO
deleted file mode 100644
index cd7d57407fb4bcd2edf86af9bdc034db95ffad0a..0000000000000000000000000000000000000000
--- a/TODO
+++ /dev/null
@@ -1,75 +0,0 @@
-TODO: 
------
-
-Important: Fix reasons for unsoundness
---------------------------------------
-
-a) eval* will use an empty set instead of an arbitrary which may result in unsoundness 
-   if the specs involve function over states
-	- fix: make the state argument a free variable (by possibly calling an uninterpreted function with zero arguments)
-
-b) the assertion on the newC function that the new closure is unevaluated may result in unsoundness
-	- fix: create new identifiers at the time of creating functions. We can also try and use epsilons.
-
-c) Coinductive (or normal) data-types can help ensure acyclicity of stream but that means we need to create 
-   free variables at the time of creation
-	currently, we are creating axioms that state acyclicity.
-
-Benchmarks to think about
---------------------------
-
-1. Lazy constraint solving 	
-	We can get demand driven pointer-analysis. Can we prove its running time ?
-
-2. Packrat parsers uses memoization
-	Can we prove its running time ?
-
-3. ConcTrees full version ?
-
-4. Binomial heaps ?
-
-5. Futures: How about redex algorithms or map-reduce algorithms ?
-
-Things to implement:
-----------------------
-
-a) A way to check monotonicity of preconditions
-b) Use monotonicity while checking proofs
-c) Automatically synthesize the invariant that the state is preserved
-d) We can always assume that state is monotonic which proving the specs
-c) Extensions for futures 
-d) Extensions for memoization
-e) Extending Orb for type parameters
-
-Integration
-
-Try to fix bugs in CVC4 proof and also try to integerate it with the system
-Try to integrate the tool with Orb
-
-Some Notes
-----------
-
-a) It is very essential to make the separation between value and state component explicit
-	we can use something like: "foo(x, uiState())" and "foo(x, st)" where st is the actual input state.
-
-b) We need to treat Lazyarg's as not closures. That is, they don't change state and are not considered as Unevaluated. They
-	are never added to the set. 
-	We can have an implicit type conversion from normal values to lazyargs. 
-	Since normal values are eagerly computed. This lets us specify lazyargs succinctly in the code.
-	We need to change isEvaluated and value function call modelling
-
-c) Futures when modelled as closures can be stored and then later joined. 
-e.g. a function  'spawn' that will spawn tasks and a function 'join' that will join 
-the spawned threads.
-However, we need to explicitly mark the functions that can be executed in parallel from the 
-point of creation of a Future to the point of joining the Future.
-Eg. as 
-spawn()
-spawn()
-parallel(op(), join(), join())
-//this means op() and the two futures will execute in parallel.
-
-d) Memoization: They are evaluated at the time of creation unlike lazy closures. 
-		We need to track whether a closure belongs to a state or not.
-
-
diff --git a/leon.out/Conqueue-edited-simplified.scala b/leon.out/Conqueue-edited-simplified.scala
deleted file mode 100644
index 33b1c10755812ef15be83065aac9591ae8d9f449..0000000000000000000000000000000000000000
--- a/leon.out/Conqueue-edited-simplified.scala
+++ /dev/null
@@ -1,520 +0,0 @@
-package lazybenchmarks
-import leon.lazyeval._
-import leon.lang._
-import leon.annotation._
-import leon.instrumentation._
-import leon.lang.synthesis._
-import ConcTrees._
-import ConQ._
-import Conqueue._
-
-object ConcTrees {
-  abstract class Conc[T] { 
-     def isEmpty : Boolean = this == Empty[T]()
-  
-	  def level : BigInt =  {
-	    this match {
-	      case Empty() =>
-		BigInt(0)
-	      case Single(x) =>
-		BigInt(0)
-	      case CC(l, r) =>
-		BigInt(1) + max(l.level, r.level)
-	    }
-	  } ensuring {
-	    (x$1 : BigInt) => x$1 >= BigInt(0)
-	  }
-  }
-  
-  case class Empty[T]() extends Conc[T]
-  
-  case class Single[T](x : T) extends Conc[T]
-  
-  case class CC[T](left : Conc[T], right : Conc[T]) extends Conc[T]
-  
-  def max(x : BigInt, y : BigInt): BigInt = if (x >= y) {
-    x
-  } else {
-    y
-  }
-  
-  def abs(x : BigInt): BigInt = if (x < BigInt(0)) {
-    -x
-  } else {
-    x
-  }
-}
-
-// TODO: change the model so that Lazyargs never appear in isEvaluated or they update state.
-object Conqueue {
-  abstract class ConQ[T] { 
-    def isSpine : Boolean = this match {
-	    case Spine(_, _) =>
-	      true
-	    case _ =>
-	      false
-	  }
-  def isTip : Boolean = !this.isSpine
-  }
-  
-  case class Tip[T](t : Conc[T]) extends ConQ[T]
-  
-  case class Spine[T](head : Conc[T], rear : LazyConQ[T]) extends ConQ[T]
-  
-  abstract class Scheds[T]
-  
-  case class Cons[T](h : LazyConQ[T], tail : Scheds[T]) extends Scheds[T]
-  
-  case class Nil[T]() extends Scheds[T]
-  
-  case class Wrapper[T](queue : LazyConQ[T], schedule : Scheds[T]) {
-	def valid(st : Set[LazyConQ[T]]): Boolean = zeroPreceedsLazy[T](this.queue, st) && schedulesProperty(this.queue, this.schedule, st)
-  } 
-  
-/*def zeroPreceedsLazy[T](q : LazyConQ[T], st : Set[LazyConQ[T]]): Boolean = {  
-    evalLazyConQS[T](q) match {
-      case Spine(h, rear) =>
-	if (st.contains(q)) {
-	      if(h == Empty[T]()) true
-	      else
-		zeroPreceedsLazy[T](rear, st)
-	} else false      
-      case Tip(_) => true      
-    } 
-}*/
-
-	def zeroPreceedsLazy[T](q : LazyConQ[T], st : Set[LazyConQ[T]]): Boolean = {  
-	  if (st.contains(q)) {
-	    evalLazyConQS[T](q) match {      
-	      case Spine(Empty(), rear) => true
-	      case Spine(_, rear) =>
-		zeroPreceedsLazy[T](rear, st)	
-	      case Tip(_) => true
-	    } 
-	  } else false      
-	}
-
-	def zeroPredLazyMonotone[T](st1 : Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q: LazyConQ[T]) : Boolean = {
-	  require(st1.subsetOf(st2) && zeroPreceedsLazy(q, st1))
-	  zeroPreceedsLazy(q, st2) && 
-	  //induction scheme 
-	  (evalLazyConQS[T](q) match {
-	      case Spine(Empty(), _) =>
-		true
-	      case Spine(h, rear) =>
-		zeroPredLazyMonotone(st1, st2, rear)
-	      case Tip(_) =>
-		true
-	    })
-	} holds
-
-  def weakZeroPreceedsLazy[T](q : LazyConQ[T], st : Set[LazyConQ[T]]): Boolean = {  
-    evalLazyConQS[T](q) match {
-      case Spine(Empty(), rear10) =>
-        true
-      case Spine(h, rear11) =>
-        zeroPreceedsLazy[T](rear11, st)
-      case Tip(_) =>
-        true
-    }} 
-  
-  @library   
-  def streamLemma[T](l : LazyConQ[T], st : Set[LazyConQ[T]]): Boolean =  {
-    st.contains(l) || (evalLazyConQS[T](l) match {
-      case Spine(_, tail14) =>
-        l != tail14 && !st.contains(tail14)
-      case _ =>
-        true
-    })
-  } ensuring {
-    (holds : Boolean) => holds
-  }
-  
-  def firstUnevaluated[T](l : LazyConQ[T], st : Set[LazyConQ[T]]): LazyConQ[T] =  {
-    if (st.contains(l)) {
-      evalLazyConQS[T](l) match {
-        case Spine(_, tail15) =>
-          firstUnevaluated[T](tail15, st)
-        case _ =>
-          l
-      }
-    } else {
-      l
-    }
-  } ensuring {
-    (res65 : LazyConQ[T]) => {
-      val dres4 = evalLazyConQ[T](res65, st)
-      (evalLazyConQS[T](res65).isTip || !st.contains(res65)) && streamLemma[T](res65, st) && (dres4._1 match {
-        case Spine(_, tail16) =>
-          ((firstUnevaluated[T](l, dres4._2) == tail16), dres4._2)
-        case _ =>
-          (true, dres4._2)
-      })._1
-    }
-  }
-  
-  def schedulesProperty[T](q : LazyConQ[T], schs : Scheds[T], st : Set[LazyConQ[T]]): Boolean = {
-	val x = firstUnevaluated(q, st)
-	val y = evalLazyConQS(x) 
-	  schs match {    
-	    case Cons(head5, tail) =>  	
-	      y.isSpine && x == head5 && schedulesProperty[T](pushUntilZero[T](head5), tail, st) && 
-		weakZeroPreceedsLazy(head5, st)    
-	    case Nil() => 	     
-		y.isTip
-	  }
-  }
-  
-  def pushUntilZero[T](q : LazyConQ[T]): LazyConQ[T] = evalLazyConQS[T](q) match {
-    case Spine(Empty(), rear12) =>
-      pushUntilZero[T](rear12)
-    case Spine(h, rear13) =>
-      rear13
-    case Tip(_) =>
-      q
-  }
-  
-  def pushLeft[T](ys : Single[T], xs : LazyConQ[T], st : Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
-    require(zeroPreceedsLazy[T](xs, st) && ys.isInstanceOf[Single[T]])
-    val dres5 = evalLazyConQ[T](xs, st)
-    dres5._1 match {
-      case Tip(CC(_, _)) =>
-        (Spine[T](ys, xs), dres5._2)
-      case Tip(Empty()) =>
-        (Tip[T](ys), dres5._2)
-      case Tip(t @ Single(_)) =>
-        (Tip[T](CC[T](ys, t)), dres5._2)
-      case s @ Spine(Empty(), rear) =>
-	(Spine[T](ys, rear), dres5._2) // in this case firstUnevaluated[T](rear, st) == firstUnevaluated[T](xs, st)
-      case s @ Spine(_, _) =>
-        pushLeftLazy[T](ys, xs, dres5._2)
-    }
-  } ensuring { res => 
-	true
-    }
-
-  def dummyFun[T]() =  ???[(ConQ[T], Set[LazyConQ[T]])]
-
-  @library
-  def pushLeftLazyUI[T](ys : Conc[T], xs : LazyConQ[T], st : Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
-	dummyFun()
-  } ensuring(res => res._2 == st && (res._1 match {
-          case Spine(Empty(), rear) =>
-	      (evalLazyConQS[T](rear).isSpine && !st.contains(rear)) &&
-              /*(firstUnevaluated[T](pushUntilZero[T](rear), st) == firstUnevaluated[T](xs, st) ||
-				      evalLazyConQS[T](firstUnevaluated[T](xs, st)).isTip) &&
-			//evalLazyConQS[T](firstUnevaluated[T](pushUntilZero[T](rear), st)).isTip)*/
-	    {
-		val p = pushUntilZero[T](rear)
-		val f = firstUnevaluated[T](xs, st)
-		((evalLazyConQS[T](f).isTip && 
-			evalLazyConQS[T](firstUnevaluated[T](p, st)).isTip) ||
-			firstUnevaluated[T](p, st) == f)			
-	    } && 
-	    weakZeroPreceedsLazy(rear, st)
-          case _ => false	      
-            //firstUnevaluated[T](rear, st) == firstUnevaluated[T](xs, st) 		
-        }))
-
-  //@library
-  def pushLeftLazy[T](ys : Conc[T], xs : LazyConQ[T], st : Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) =  {
-    require(!ys.isEmpty && zeroPreceedsLazy[T](xs, st) && 
-		(evalLazyConQS[T](xs) match { 
-		  case Spine(h, _) => h != Empty[T]()
-		  case _ => false
-		}))
-    val dres = evalLazyConQ[T](xs, st)
-    dres._1 match {
-      case Spine(head, rear15) =>
-        val carry = CC[T](head, ys)
-        val dres2 = evalLazyConQ[T](rear15, dres._2)
-        dres2._1 match {
-	  case s @ Spine(Empty(), _) =>
-	    (Spine[T](Empty[T](), newConQ(Lazyarg1[T](Spine(carry, rear15)), dres2._2)), dres2._2)
-          case s @ Spine(_, _) =>
-            (Spine[T](Empty[T](), newConQ[T](PushLeftLazy[T](carry, rear15), dres2._2)), dres2._2)
-          case t @ Tip(tree) =>
-	    if (tree.level > carry.level) { // can this happen ? this means tree is of level at least two greater than rear ?
-              val x : ConQ[T] = t
-              val y : ConQ[T] = Spine[T](carry, rear) //newConQ[T](Lazyarg1[T](x), dres._2))
-              (Spine[T](Empty[T](), newConQ[T](Lazyarg1[T](y), dres2._2)), dres2._2)
-            } else {// here tree level and carry level are equal
-              val x : ConQ[T] = Tip[T](CC[T](tree, carry))
-              val y : ConQ[T] = Spine[T](Empty[T](), newConQ[T](Lazyarg1[T](x), dres._2))
-              (Spine[T](Empty[T](), newConQ[T](Lazyarg1[T](y), dres2._2)), dres2._2)	
-            }
-        }
-    }
-  } ensuring {
-    (res66 : (Spine[T], Set[LazyConQ[T]])) => (res66._2 == st) && (res66._1 match {
-      case Spine(Empty(), rear) =>
-	     val (rearval, _) = evalLazyConQ[T](rear, st) // this is necessary to assert properties on the state in the recurisve invocation
-	     rearval.isSpine && !st.contains(rear) && {
-		val p = pushUntilZero[T](rear)
-		val f = firstUnevaluated[T](xs, st)
-		((evalLazyConQS[T](f).isTip && 
-			evalLazyConQS[T](firstUnevaluated[T](p, st)).isTip) ||
-			firstUnevaluated[T](p, st) == f) 
-	    } &&
-	    weakZeroPreceedsLazy(rear, st)
-      case _ =>
-	false        
-    })
-  }
-  
-  /*def PushLeftLazypushLeftLazyLem[T](rear15 : LazyConQ[T], head : Conc[T], dres : (ConQ[T], Set[LazyConQ[T]]), st : Set[LazyConQ[T]], xs : LazyConQ[T], s : Spine[T], dres : (ConQ[T], Set[LazyConQ[T]]), carry : CC[T], ys : Conc[T]): Boolean =  {
-    (!ys.isEmpty && zeroPreceedsLazy[T](xs, st) && evalLazyConQS[T](xs).isSpine && dres == evalLazyConQ[T](xs, st) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && head == dres._1.head && rear15 == dres._1.rear && carry == CC[T](head, ys) && dres == evalLazyConQ[T](rear15, dres._2) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && s == dres._1) ==> (!carry.isEmpty && zeroPreceedsLazy[T](rear15, dres._2) && evalLazyConQS[T](rear15).isSpine)
-  } ensuring {
-    (holds : Boolean) => holds
-  }*/
-
-  def streamContains[T](l: LazyConQ[T], newl: LazyConQ[T]) : Boolean = {
-    (l == newl) || (evalLazyConQS[T](l) match {
-       case Spine(_ , tail) => 
-          streamContains(tail, newl)
-        case _ => false
-     })
-  }
-  
-  // monotonicity of fune
-  def funeMonotone[T](st1 : Set[LazyConQ[T]], st2 : Set[LazyConQ[T]], l : LazyConQ[T], newl: LazyConQ[T]) : Boolean = {   
-    require(st2 == st1 ++ Set(newl) && 
-      !streamContains(l, newl))
-    (firstUnevaluated(l, st1) == firstUnevaluated(l, st2)) && //property
-	 //induction scheme
-      (evalLazyConQS[T](l) match {
-        case Spine(_, tail) =>	  
-          	funeMonotone(st1, st2, tail, newl)
-        case _ =>          
-		        true
-      })    
-  } holds
-
-  // isConcrete monotonicity
-  // def concreteMonotone[T](st1 : Set[LazyConQ[T]], st2 : Set[LazyConQ[T]], l : LazyConQ[T]) : Boolean = {   
-  //   ((isConcrete(l, st1) && st1.subsetOf(st2)) ==> isConcrete(l, st2)) && {
-  //     // induction scheme
-  //     evalLazyConQS[T](l) match {
-  //       case Spine(_, tail) =>
-  //         concreteMonotone[T](st1, st2, tail)
-  //       case _ =>
-  //         true
-  //     }
-  //   }
-  // } holds
-
-  @library // To be proven
-  def schedMonotone[T](st1: Set[LazyConQ[T]], st2 : Set[LazyConQ[T]], scheds: Scheds[T], l : LazyConQ[T], newl: LazyConQ[T]) : Boolean = {  
-    require((st2 == st1 ++ Set(newl)) && 
-      !streamContains(l, newl) && // newl is not contained in 'l'
-      schedulesProperty(l, scheds, st1)
-      ) 
-    //concreteMonotone(st1, st2, l) && 
-    zeroPredLazyMonotone(st1, st2, l) &&
-    funeMonotone(st1, st2, l, newl) &&   //instantiations   
-       schedulesProperty(l, scheds, st2) &&  //property
-       //induction scheme
-       (scheds match {
-          case Cons(head, tail) => 
-            schedMonotone(st1, st2, tail, pushUntilZero(head), newl)
-          case Nil() => true
-        })      
-  } holds
-  
-  @library
-  def newLazyCons[T](q: ConQ[T], st : Set[LazyConQ[T]]) : LazyConQ[T] = {
-    newConQ[T](Lazyarg1(q), st)
-  } ensuring(r => q match {
-      case Spine(_, rear) =>
-        !streamContains(rear, r)
-        case _ => true          
-    })
-
-  //@library
-  def pushLeftWrapper[T](ys : Single[T], w : Wrapper[T], st : Set[LazyConQ[T]]) = { 
-    require(w.valid(st) && ys.isInstanceOf[Single[T]])    
-    val (nq, nst) = pushLeft[T](ys, w.queue, st)    
-    val nsched = nq match {
-      case Spine(Empty(), rear18) =>
-        Cons[T](rear18, w.schedule)
-      case Spine(_, _) =>
-        w.schedule
-      case _ => Nil[T]() // if 'nq' is Tip don't add it to schedule
-    }           
-    val lq = newLazyCons(nq, nst)
-    val (_, rst) = evalLazyConQ(lq, nst)
-    (lq, nsched, rst)
-  } ensuring {res =>  
-    //zeroPreceedsLazy(res._1, res._3)    
-    schedulesProperty(res._1, res._2, res._3) &&     
-    // instantiations
-     (evalLazyConQS(res._1) match { 
-          case Spine(_, rear) =>
-            schedMonotone(st, res._3, res._2, rear, res._1)
-          case _ => true
-     })
-  }
-
-  @library
-  def dummyAxiom[T](l: LazyConQ[T], nl: LazyConQ[T]) : Boolean = {
-    !streamContains(l, nl)
-  } holds
-
-  def funeCompose[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q : LazyConQ[T]) : Boolean = {
-    require(st1.subsetOf(st2))
-    (firstUnevaluated(firstUnevaluated(q, st1), st2) == firstUnevaluated(q, st2)) && //property
-    //induction scheme
-      (evalLazyConQS[T](q) match {
-        case Spine(_, tail) =>    
-            funeCompose(st1, st2, tail)
-        case _ =>          
-            true
-      }) 
-  } holds
-  
-  // induction following the structure of zeroPredLazy
-  def funeZeroProp[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q : LazyConQ[T]) : Boolean = {
-    require(st1.subsetOf(st2) && {
- 	val x = firstUnevaluated(q, st1)
-	st2.contains(x) && weakZeroPreceedsLazy(x, st1)
-     })
-    zeroPreceedsLazy(q, st2) && //property
-    //induction scheme
-      (evalLazyConQS[T](q) match {
-        case Spine(Empty(), tail) =>    
-	    true
-	case Spine(_, tail) =>
- 	    (if(st1.contains(q))
-                 funeZeroProp(st1, st2, tail)
-	     else true)
-        case _ =>          
-            true
-      }) 
-  } holds
-
-  // induction following the structure of zeroPredLazy
-  /*def funeZeroProp2[T](st: Set[LazyConQ[T]], q : LazyConQ[T]) : Boolean = {
-     require(evalLazyConQS(firstUnevaluated(q, st)).isTip)
-    zeroPreceedsLazy(q, st) && //property
-    //induction scheme
-      (evalLazyConQS[T](q) match {
-        case Spine(Empty(), tail) =>    
-	    true
-	case Spine(_, tail) =>
-            funeZeroProp2(st, tail)
-        case _ =>          
-            true
-      }) 
-  } holds*/
-  
-  //@library
-  def Pay[T](q : LazyConQ[T], scheds : Scheds[T], st : Set[LazyConQ[T]]): (Scheds[T], Set[LazyConQ[T]]) = {
-    require(schedulesProperty(q, scheds, st) && st.contains(q))    
-    val (nschs, rst) = scheds match {
-      case c @ Cons(head, rest) =>
-        val (headval, st2) = evalLazyConQ(head, st)
-        (headval match {
-          case Spine(Empty(), rear) =>
-            evalLazyConQS(rear) match{ // note: here we are ignoring tip
-              case Spine(_, _) => 
-                Cons(rear, rest) 
-              case _ => rest
-	    }
-          case _ =>
-            rest  // in this case: firstUnevaluated[T](rear, st) == rhead  && firstUnevaluated[T](q, res._2) == rhead by funeCompose                                                
-        }, st2)
-      case Nil() =>        
-        (scheds, st)
-    }
-    (nschs, rst)
-  } ensuring {res =>
-     schedulesProperty(q, res._1, res._2) &&      
-     // instantiations (relating rhead and head)
-     funeCompose(st, res._2, q) && 
-     (res._1 match {
-      case Cons(rhead, rtail) =>	  
-          (scheds match {
-              case Cons(head, rest) =>                 
-		val p = pushUntilZero(rhead)
-                dummyAxiom(p, head) &&   
-		schedMonotone(st, res._2, rtail, p, head)
-                /*(evalLazyConQS(firstUnevaluated(q, st)).isTip ||		            		
-            		( (evalLazyConQS(head) match {
-            			case Spine(Empty(), rear) => 
-				   //firstUnevaluated(q, res._2) == rhead && 
-				   //schedulesProperty(pushUntilZero(rhead), rtail, st)      			           
-            			case Spine(_, rear) => 
-				  //firstUnevaluated(rear, st) == rhead &&                                        		  
-				  //schedulesProperty(pushUntilZero(head), res._1, st) &&
-				  //schedulesProperty(pushUntilZero(rhead), rtail, st) 				  
-	                          schedMonotone(st, res._2, rtail, pushUntilZero(rhead), head)
-            		}))
-                )*/		
-              case _ => true 		
-          })        
-      case _ => true          
-     }) &&
-     (scheds match {
-         case Cons(head, rest) =>                 
-		// establishing the zeroPreceedsLazy Property (on this case)
-	  	//fune(q, st) == head && weakZero(head, st) && res._2 == st ++ { head }
-		zeroPreceedsLazy(q, res._2) &&
-		funeZeroProp(st, res._2, q)  //instantiation
-	case _ => 
-		true
-		// here, we should have the list to be concrete
-     })
-  }
-
-  def pushLeftAndPay[T](ys : Single[T], w : Wrapper[T], st : Set[LazyConQ[T]]): (Wrapper[T], Set[LazyConQ[T]]) = {
-    require(w.valid(st) && ys.isInstanceOf[Single[T]] && w.schedule != Nil[T]())
-    val (q, scheds, nst) =  pushLeftWrapper(ys, w, st)
-    val (nscheds, fst) = Pay(q, scheds, nst)
-    (Wrapper(q, nscheds), fst)
-  } ensuring {res => res._1.valid(res._2) }
-  
-  def lazyarg1[T](x : ConQ[T]): ConQ[T] = x
-}
-
-object ConQ {  
-  
-  abstract class LazyConQ[T1]
-  
-  case class Lazyarg1[T](x : ConQ[T]) extends LazyConQ[T]
-  
-  case class PushLeftLazy[T](ys : Conc[T], xs : LazyConQ[T]) extends LazyConQ[T]
-  
-  @library
-  def newConQ[T1](cc : LazyConQ[T1], st : Set[LazyConQ[T1]]): LazyConQ[T1] =  {
-    cc
-  } ensuring {
-    (res : LazyConQ[T1]) => !st.contains(res)
-  }
-  
-  @library
-  def evalLazyConQ[T](cl : LazyConQ[T], st : Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) =  {
-    cl match {
-      case t : Lazyarg1[T] =>
-        (t.x, (st ++ Set[LazyConQ[T]](t)))
-      case t : PushLeftLazy[T] =>
-	val plres = pushLeftLazyUI[T](t.ys, t.xs, uiState())._1
-	val plst = pushLeftLazyUI[T](t.ys, t.xs, st)._2
-        (plres, (plst ++ Set[LazyConQ[T]](t)))
-    }
-  } ensuring {
-    (res : (ConQ[T], Set[LazyConQ[T]])) => (cl match {
-      case t : Lazyarg1[T] =>      
-	res._1 match {
-	 case Spine(_, r) => weakZeroPreceedsLazy(r, st)  // need to be autogen
-	 case _ => true
-        }
-      case t : PushLeftLazy[T] =>        
-	true
-     })
-  }
-
-  def uiState[T]() : Set[LazyConQ[T]] = ???[Set[LazyConQ[T]]]
-  
-  def evalLazyConQS[T](cl : LazyConQ[T]): ConQ[T] = evalLazyConQ[T](cl, uiState())._1
- 
-}
diff --git a/leon.out/Conqueue-strategy2.scala b/leon.out/Conqueue-strategy2.scala
new file mode 100644
index 0000000000000000000000000000000000000000..2cadbadd3b288637c2d20223d92033ee577b1a15
--- /dev/null
+++ b/leon.out/Conqueue-strategy2.scala
@@ -0,0 +1,633 @@
+package lazybenchmarks
+import leon.lazyeval._
+import leon.lang._
+import leon.annotation._
+import leon.instrumentation._
+import leon.lang.synthesis._
+import ConcTrees._
+import ConQ._
+import Conqueue._
+
+object ConcTrees {
+  abstract class Conc[T] {
+    def isEmpty: Boolean = this == Empty[T]()
+
+    def level: BigInt = {
+      this match {
+        case Empty() =>
+          BigInt(0)
+        case Single(x) =>
+          BigInt(0)
+        case CC(l, r) =>
+          BigInt(1) + max(l.level, r.level)
+      }
+    } ensuring {
+      (x$1: BigInt) => x$1 >= BigInt(0)
+    }
+  }
+
+  case class Empty[T]() extends Conc[T]
+
+  case class Single[T](x: T) extends Conc[T]
+
+  case class CC[T](left: Conc[T], right: Conc[T]) extends Conc[T]
+
+  def max(x: BigInt, y: BigInt): BigInt = if (x >= y) {
+    x
+  } else {
+    y
+  }
+
+  def abs(x: BigInt): BigInt = if (x < BigInt(0)) {
+    -x
+  } else {
+    x
+  }
+}
+
+object Conqueue {
+  abstract class ConQ[T] {
+    def isSpine: Boolean = this match {
+      case Spine(_, _, _) =>
+        true
+      case _ =>
+        false
+    }
+    def isTip: Boolean = !this.isSpine
+  }
+
+  case class Tip[T](t: Conc[T]) extends ConQ[T]
+
+  case class Spine[T](head: Conc[T], createdWithSusp: Bool, rear: LazyConQ[T]) extends ConQ[T]
+
+  sealed abstract class Bool
+  case class True() extends Bool
+  case class False() extends Bool
+
+  abstract class Scheds[T]
+
+  case class Cons[T](h: LazyConQ[T], tail: Scheds[T]) extends Scheds[T]
+
+  case class Nil[T]() extends Scheds[T]
+
+  case class Wrapper[T](queue: LazyConQ[T], schedule: Scheds[T]) {
+    def valid(st: Set[LazyConQ[T]]): Boolean = zeroPreceedsLazy[T](this.queue, st) &&
+      // schedulesProperty(this.queue, this.schedule, st)
+      strongSchedsProp(this.queue, this.schedule, st)
+  }
+
+  def zeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (isEvaluated(q, st)) {
+      evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, rear) => true
+        case Spine(_, _, rear) =>
+          zeroPreceedsLazy[T](rear, st)
+        case Tip(_) => true
+      }
+    } else false
+  }
+
+  // not used, but still interesting
+  def zeroPredLazyMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && zeroPreceedsLazy(q, st1))
+    zeroPreceedsLazy(q, st2) &&
+      //induction scheme
+      (evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, _) =>
+          true
+        case Spine(h, _, rear) =>
+          zeroPredLazyMonotone(st1, st2, rear)
+        case Tip(_) =>
+          true
+      })
+  } holds
+
+  def weakZeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    evalLazyConQS[T](q) match {
+      case Spine(Empty(), _, rear10) =>
+        true
+      case Spine(h, _, rear11) =>
+        zeroPreceedsLazy[T](rear11, st)
+      case Tip(_) =>
+        true
+    }
+  }
+
+  def firstUnevaluated[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): LazyConQ[T] = {
+    if (isEvaluated(l, st)) {
+      evalLazyConQS[T](l) match {
+        case Spine(_, _, tail15) =>
+          firstUnevaluated[T](tail15, st)
+        case _ =>
+          l
+      }
+    } else {
+      l
+    }
+  } ensuring {
+    (res65: LazyConQ[T]) =>
+      (evalLazyConQS[T](res65).isTip || !st.contains(res65)) &&
+        concreteUntil(l, res65, st)
+  }
+
+  def nextUnevaluatedLemma[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    val (res, nst) = evalLazyConQ[T](firstUnevaluated(l, st), st)
+    (res match {
+      case Spine(_, _, tail) =>
+        firstUnevaluated[T](l, nst) == firstUnevaluated[T](tail, nst)
+      case _ =>
+        true
+    }) &&
+      // induction scheme
+      (evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          nextUnevaluatedLemma(tail, st)
+        case _ => true
+      })
+  } holds
+
+  /**
+   * Everything until suf is evaluated
+   */
+  def concreteUntil[T](l: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (l != suf) {
+      isEvaluated(l, st) && (evalLazyConQS[T](l) match {
+        case Spine(_, cws, tail15) =>
+          concreteUntil(tail15, suf, st)
+        case _ =>
+          true
+      })
+    } else true
+  }
+
+  def isConcrete[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = isEvaluated(l, st) && (evalLazyConQS[T](l) match {
+    case Spine(_, _, tail13) =>
+      isConcrete[T](tail13, st)
+    case _ =>
+      true
+  })
+
+  def schedulesProperty[T](q: LazyConQ[T], schs: Scheds[T], st: Set[LazyConQ[T]]): Boolean = {
+    schs match {
+      case Cons(head, tail) =>
+        evalLazyConQS[T](head) match {
+          case Spine(Empty(), _, _) =>
+            !head.isInstanceOf[Eager[T]] &&
+              concreteUntil(q, head, st) && schedulesProperty[T](pushUntilZero[T](head), tail, st)
+          case _ =>
+            false
+        }
+      case Nil() =>
+        isConcrete(q, st)
+    }
+  }
+
+  def strongSchedsProp[T](q: LazyConQ[T], schs: Scheds[T], st: Set[LazyConQ[T]]) = {
+    isEvaluated(q, st) && {
+      val l = evalLazyConQS[T](q) match {
+        case Spine(_, _, rear) =>
+          rear
+        case _ => q
+      }
+      schedulesProperty(l, schs, st)
+    }
+  }
+
+  // pushes a carry until there is a one
+  // TODO: rename this to pushUntilCarry
+  def pushUntilZero[T](q: LazyConQ[T]): LazyConQ[T] = evalLazyConQS[T](q) match {
+    case Spine(Empty(), _, rear12) =>
+      pushUntilZero[T](rear12)
+    case Spine(h, _, rear13) =>
+      rear13
+    case Tip(_) =>
+      q
+  }
+
+  def pushLeft[T](ys: Single[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(zeroPreceedsLazy[T](xs, st) && ys.isInstanceOf[Single[T]])
+
+    val dres5 = evalLazyConQ[T](xs, st)
+    dres5._1 match {
+      case Tip(CC(_, _)) =>
+        (Spine[T](ys, False(), xs), dres5._2)
+      case Tip(Empty()) =>
+        (Tip[T](ys), dres5._2)
+      case Tip(t @ Single(_)) =>
+        (Tip[T](CC[T](ys, t)), dres5._2)
+      case s @ Spine(Empty(), _, rear) =>
+        (Spine[T](ys, False(), rear), dres5._2) // in this case firstUnevaluated[T](rear, st) == firstUnevaluated[T](xs, st)
+      case s @ Spine(_, _, _) =>
+        pushLeftLazy[T](ys, xs, dres5._2)
+    }
+  }
+
+/*  def dummyFun[T]() = ???[(ConQ[T], Set[LazyConQ[T]])]
+
+  @library
+  def pushLeftLazyUI[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    dummyFun()
+  } ensuring (res => res._2 == st && (res._1 match {
+    case Spine(Empty(), createdWithSusp, rear) =>
+      true
+    case _ => false
+  }))*/
+
+  def pushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(!ys.isEmpty && zeroPreceedsLazy[T](xs, st) &&
+      (evalLazyConQS(xs) match {
+        case Spine(h, _, _) => h != Empty[T]()
+        case _              => false
+      }))
+    val dres = evalLazyConQ[T](xs, st)
+    dres._1 match {
+      case Spine(head, _, rear15) =>
+        val carry = CC[T](head, ys)
+        val dres2 = evalLazyConQ[T](rear15, dres._2)
+        dres2._1 match {
+          case s @ Spine(Empty(), _, _) =>
+            (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), rear15))), dres2._2)
+          case s @ Spine(_, _, _) =>
+            (Spine[T](Empty[T](), True(), PushLeftLazy[T](carry, rear15 /*, suf*/ )), dres2._2)
+          case t @ Tip(tree) =>
+            if (tree.level > carry.level) { // can this happen ? this means tree is of level at least two greater than rear ?
+              val x: ConQ[T] = t
+              (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), rear15))), dres2._2)
+            } else { // here tree level and carry level are equal
+              val x: ConQ[T] = Tip[T](CC[T](tree, carry))
+              (Spine[T](Empty[T](), False(), Eager(Spine(Empty[T](), False(), Eager[T](x)))), dres2._2)
+            }
+        }
+    }
+  } ensuring {
+    (res66: (Spine[T], Set[LazyConQ[T]])) =>
+      (res66._2 == st) && (res66._1 match {
+        case Spine(Empty(), createdWithSusp, rear) =>
+          val (rearval, _) = evalLazyConQ[T](rear, st) // this is necessary to assert properties on the state in the recursive invocation
+          (!isConcrete(xs, st) || isConcrete(pushUntilZero(rear), st))
+        case _ =>
+          false
+      })
+  }
+
+  def pushLeftLazyLemma[T](ys: Conc[T], xs: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(!ys.isEmpty && zeroPreceedsLazy[T](xs, st) &&
+      (evalLazyConQS(xs) match {
+        case Spine(h, _, _) => h != Empty[T]()
+        case _              => false
+      }) &&
+      (evalLazyConQS(suf) match {
+        case Spine(Empty(), _, _) =>
+          concreteUntil(xs, suf, st)
+        case _ => false
+      }))
+    // property
+    (pushLeftLazy(ys, xs, st)._1 match {
+      case Spine(Empty(), createdWithSusp, rear) =>
+        // forall suf. suf*.head == Empty() ^ concUntil(xs, suf, st) => concUntil(push(rear), suf, st)
+        val p = pushUntilZero[T](rear)
+        concreteUntil(p, suf, st)
+    }) &&
+      // induction scheme
+      (evalLazyConQS(xs) match {
+        case Spine(head, _, rear15) =>
+          val carry = CC[T](head, ys)
+          evalLazyConQS(rear15) match {
+            case s @ Spine(h, _, _) if h != Empty[T]() =>
+              pushLeftLazyLemma(carry, rear15, suf, st)
+            case _ => true
+          }
+      })
+  } holds
+
+  def pushLeftWrapper[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]])
+    val (nq, nst) = pushLeft[T](ys, w.queue, st)
+    val nsched = nq match {
+      case Spine(Empty(), createdWithSusp, rear18) if createdWithSusp == True() =>
+        Cons[T](rear18, w.schedule) // this is the only case where we create a new lazy closure
+      case _ =>
+        w.schedule
+    }
+    (Eager(nq), nsched, nst)
+  } ensuring { res =>
+    strongSchedsProp(res._1, res._2, res._3) &&
+      // lemma instantiations
+      (w.schedule match {
+        case Cons(head, tail) =>
+          evalLazyConQS(w.queue) match {
+            case Spine(h, _, _) if h != Empty[T]() =>
+              pushLeftLazyLemma(ys, w.queue, head, st)
+            case _ => true
+          }
+        case _ => true
+      })
+  }
+
+  /*def PushLeftLazypushLeftLazyLem[T](rear15 : LazyConQ[T], head : Conc[T], dres : (ConQ[T], Set[LazyConQ[T]]), st : Set[LazyConQ[T]], xs : LazyConQ[T], s : Spine[T], dres : (ConQ[T], Set[LazyConQ[T]]), carry : CC[T], ys : Conc[T]): Boolean =  {
+    (!ys.isEmpty && zeroPreceedsLazy[T](xs, st) && evalLazyConQS[T](xs).isSpine && dres == evalLazyConQ[T](xs, st) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && head == dres._1.head && rear15 == dres._1.rear && carry == CC[T](head, ys) && dres == evalLazyConQ[T](rear15, dres._2) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && s == dres._1) ==> (!carry.isEmpty && zeroPreceedsLazy[T](rear15, dres._2) && evalLazyConQS[T](rear15).isSpine)
+  } ensuring {
+    (holds : Boolean) => holds
+  }*/
+
+  def schedMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], scheds: Scheds[T], l: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && schedulesProperty(l, scheds, st1))
+    schedulesProperty(l, scheds, st2) && //property
+      //induction scheme
+      (scheds match {
+        case Cons(head, tail) =>
+          evalLazyConQS[T](head) match {
+            case Spine(_, _, rear) =>
+              concUntilMonotone(l, head, st1, st2) &&
+                schedMonotone(st1, st2, tail, pushUntilZero(head))
+            case _ => true
+          }
+        case Nil() =>
+          concreteMonotone(st1, st2, l)
+      })
+  } holds
+
+  def concreteMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], l: LazyConQ[T]): Boolean = {
+    require(isConcrete(l, st1) && st1.subsetOf(st2))
+    isConcrete(l, st2) && {
+      // induction scheme
+      evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          concreteMonotone[T](st1, st2, tail)
+        case _ =>
+          true
+      }
+    }
+  } holds
+
+  def concUntilExtenLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st))
+    val (next, nst) = evalLazyConQ[T](suf, st)
+    (next match {
+      case Spine(_, _, rear) =>
+        concreteUntil(q, rear, nst)
+      case _ => true
+    }) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilExtenLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  def concUntilExtenLemma2[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && isConcrete(suf, st))
+    isConcrete(q, st) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilExtenLemma2(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
+  def concUntilCompose[T](q: LazyConQ[T], suf1: LazyConQ[T], suf2: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf1, st) && concreteUntil(suf1, suf2, st))
+    concreteUntil(q, suf2, st) &&
+      (if (q != suf1) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilCompose(tail, suf1, suf2, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
+  def concUntilMonotone[T](q: LazyConQ[T], suf: LazyConQ[T], st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st1) && st1.subsetOf(st2))
+    concreteUntil(q, suf, st2) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilMonotone(tail, suf, st1, st2)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
+  def concUntilZeroPredLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && (evalLazyConQS(suf) match {
+      case Spine(Empty(), _, _) => true
+      case _                    => false
+    }))
+    val (next, nst) = evalLazyConQ[T](suf, st)
+    zeroPreceedsLazy(q, nst) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilZeroPredLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  def concreteZeroPredLemma[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(isConcrete(q, st))
+    zeroPreceedsLazy(q, st) && {
+      // induction scheme
+      evalLazyConQS[T](q) match {
+        case Spine(_, _, tail) =>
+          concreteZeroPredLemma[T](tail, st)
+        case _ =>
+          true
+      }
+    }
+  } holds
+
+  def Pay[T](q: LazyConQ[T], scheds: Scheds[T], st: Set[LazyConQ[T]]): (Scheds[T], Set[LazyConQ[T]]) = {
+    require(strongSchedsProp(q, scheds, st) && isEvaluated(q, st))
+    val (nschs, rst) = scheds match {
+      case c @ Cons(head, rest) =>
+        val (headval, st2) = evalLazyConQ(head, st)
+        (headval match {
+          case Spine(Empty(), createdWithSusp, rear) => // note: no other case is possible
+            if (createdWithSusp == True()) {
+              Cons(rear, rest)
+            } else
+              rest
+          //            In this case,
+          //              val prear = pushUntilZero(rear)
+          //            	concreteUntil(q, rhead, res._2) && concreteUntil(prear, rhead, st) && concreteUntil(rear, rhead, st) && schedulesProperty(prhead, rtail, st)
+        }, st2)
+      case Nil() =>
+        (scheds, st)
+    }
+    (nschs, rst)
+  } ensuring { res =>
+    val l = evalLazyConQS[T](q) match {
+      case Spine(_, _, rear) =>
+        rear
+      case _ => q
+    }
+    strongSchedsProp(q, res._1, res._2) &&
+      zeroPreceedsLazy(q, res._2) &&
+      (scheds match {
+        case Cons(head, rest) =>
+          concUntilExtenLemma(l, head, st) &&
+            (res._1 match {
+              case Cons(rhead, rtail) =>
+                val prhead = pushUntilZero(rhead)
+                schedMonotone(st, res._2, rtail, prhead) &&
+                  (evalLazyConQS(head) match {
+                    case Spine(Empty(), cws, rear) =>
+                      if (cws == False()) {
+                        concUntilMonotone(rear, rhead, st, res._2) &&
+                          concUntilCompose(l, rear, rhead, res._2)
+                      } else true
+                  })
+              case _ =>
+                evalLazyConQS(head) match {
+                  case Spine(Empty(), _, rear) =>
+                    concreteMonotone(st, res._2, rear) &&
+                      concUntilExtenLemma2(l, rear, res._2)
+                }
+            })
+        case _ => true
+      }) &&
+      // instantiations for zeroPreceedsLazy
+      (scheds match {
+        case Cons(head, rest) =>
+          //concUntil(l, head, st) == head && head* == Spine(Empty(), _) && res._2.subsetOf(st ++ { head })
+          concUntilZeroPredLemma(l, head, st)
+        case _ =>
+          concreteZeroPredLemma(q, res._2)
+      })
+  }
+
+  def pushLeftAndPay[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]])
+    val (q, scheds, nst) = pushLeftWrapper(ys, w, st)
+    val (nscheds, fst) = Pay(q, scheds, nst)
+    (Wrapper(q, nscheds), fst)
+  } ensuring { res => res._1.valid(res._2) }
+
+  def lazyarg1[T](x: ConQ[T]): ConQ[T] = x
+}
+
+object ConQ {
+
+  abstract class LazyConQ[T1]
+
+  case class Eager[T](x: ConQ[T]) extends LazyConQ[T]
+
+  case class PushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T] /*, suf: LazyConQ[T]*/ ) extends LazyConQ[T]
+
+  @library
+  def newConQ[T1](cc: LazyConQ[T1], st: Set[LazyConQ[T1]]): LazyConQ[T1] = {
+    cc
+  } ensuring {
+    (res: LazyConQ[T1]) => !st.contains(res)
+  }
+
+  @library
+  def evalLazyConQ[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    cl match {
+      case t: Eager[T] =>
+        (t.x, st)
+      case t: PushLeftLazy[T] =>
+        val plres = pushLeftLazy[T](t.ys, t.xs, uiState())._1
+        val plst = pushLeftLazy[T](t.ys, t.xs, st)._2
+        //val plres = pushLeftLazyUI[T](t.ys, t.xs, uiState())._1
+        //val plst = pushLeftLazyUI[T](t.ys, t.xs, st)._2
+        (plres, (plst ++ Set[LazyConQ[T]](t)))
+    }
+  }
+
+  def isEvaluated[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]) = st.contains(cl) || cl.isInstanceOf[Eager[T]]
+
+  def uiState[T](): Set[LazyConQ[T]] = ???[Set[LazyConQ[T]]]
+
+  def evalLazyConQS[T](cl: LazyConQ[T]): ConQ[T] = evalLazyConQ[T](cl, uiState())._1
+
+}
+
+// Cases that had to be considered for pushLeftWrapper
+      /*(w.schedule match {
+      case Cons(head, tail) => true
+        //strongSchedsProp(res._1, res._2, res._3)
+      case _ =>
+        res._2 match {
+          case Nil() => true
+            //strongSchedsProp(res._1, res._2, res._3)
+          case _ =>
+            strongSchedsProp(res._1, res._2, res._3)
+        }
+    }) &&*/
+      /*(//pushLeft(ys, w.queue, st)._1 match {
+      //case Spine(Empty(), createdWithSusp, _) if createdWithSusp == True() => true
+        /*(w.schedule match {
+          case Cons(head, tail) =>
+            schedulesProperty(res._1, res._2, res._3)
+          case _ => true
+        })*/
+      //case Spine(Empty(), _, _)  => true
+        /*(w.schedule match {
+          case Cons(head, tail) =>
+            schedulesProperty(res._1, res._2, res._3)
+          case _ => true
+        })*/
+      //case Spine(_, _, _) =>
+        /*(w.schedule match {
+          case Cons(head, tail) =>
+            schedulesProperty(res._1, res._2, res._3)
+          case _ => true
+        })*/
+      //case _ => true
+        //schedulesProperty(res._1, res._2, res._3)
+    })*/
+
+  /*def payLemma[T](q: LazyConQ[T], scheds: Scheds[T], st: Set[LazyConQ[T]]) = {
+    require(strongSchedsProp(q, scheds, st) && isEvaluated(q, st))
+    val res = Pay(q, scheds, st)
+    val l = evalLazyConQS[T](q) match {
+      case Spine(_, _, rear) =>
+        rear
+      case _ => q
+    }
+    strongSchedsProp(q, res._1, res._2) &&
+      zeroPreceedsLazy(q, res._2) &&
+      (scheds match {
+        case Cons(head, rest) =>
+          concUntilExtenLemma(l, head, st) &&
+            (res._1 match {
+              case Cons(rhead, rtail) =>
+                val prhead = pushUntilZero(rhead)
+                schedMonotone(st, res._2, rtail, prhead) &&
+                  (evalLazyConQS(head) match {
+                    case Spine(Empty(), cws, rear) =>
+                      if (cws == False()) {
+                        concUntilMonotone(rear, rhead, st, res._2) &&
+                          concUntilCompose(l, rear, rhead, res._2)
+                      } else true
+                  })
+              case _ =>
+                evalLazyConQS(head) match {
+                  case Spine(Empty(), _, rear) =>
+                    concreteMonotone(st, res._2, rear) &&
+                      concUntilExtenLemma2(l, rear, res._2)
+                }
+            })
+        case _ => true
+      }) &&
+      // instantiations for zeroPreceedsLazy
+      (scheds match {
+        case Cons(head, rest) =>
+          //concUntil(l, head, st) == head && head* == Spine(Empty(), _) && res._2.subsetOf(st ++ { head })
+          concUntilZeroPredLemma(l, head, st)
+        case _ =>
+          concreteZeroPredLemma(q, res._2)
+      })
+  } ensuring (_ == true)*/
\ No newline at end of file
diff --git a/library/lazy/package.scala b/library/lazy/package.scala
index d706f0ec3d55902f8d0d5695b7bda08670560849..448eb2a3deb45807c27d45e0a2629ba086236343 100644
--- a/library/lazy/package.scala
+++ b/library/lazy/package.scala
@@ -13,12 +13,35 @@ object $ {
   /**
    * implicit conversion from eager evaluation to lazy evaluation
    */
-  @inline //inlining call-by-name here
+  @inline
   implicit def eagerToLazy[T](x: T) = eager(x)
+
+  /**
+   * accessing in and out states.
+   * Should be used only in ensuring.
+   * TODO: enforce this.
+   */
+  @extern
+  def inState[T] : Set[$[T]] = sys.error("inState method is not executable!")
+
+  @extern
+  def outState[T] : Set[$[T]] = sys.error("outState method is not executable")
+
+  /**
+   * Helper class for invoking with a given state instead of the implicit state
+   */
+  @library
+  case class WithState[T](v: T) {
+    @extern
+    def withState[U](x: Set[$[U]]): T = sys.error("withState method is not executable!")
+  }
+
+  @inline
+  implicit def toWithState[T](x: T) = new WithState(x)
 }
 
 @library
-case class $[T](f: Unit => T) { // leon does not support call by name as of now
+case class $[T](f: Unit => T) {
   @extern
   lazy val value = {
     val x = f(())
@@ -32,4 +55,7 @@ case class $[T](f: Unit => T) { // leon does not support call by name as of now
 
   @extern
   def isEvaluated = eval
+
+  @extern
+  def isSuspension[T](f: T) : Boolean = sys.error("isSuspension method is not executable")
 }
diff --git a/src/main/scala/leon/invariant/structure/FunctionUtils.scala b/src/main/scala/leon/invariant/structure/FunctionUtils.scala
index ed01bdd3d6c848df34656970eb7ddf2097cb18d5..3124f9d037b2c313b58c6704874dd70977b015a1 100644
--- a/src/main/scala/leon/invariant/structure/FunctionUtils.scala
+++ b/src/main/scala/leon/invariant/structure/FunctionUtils.scala
@@ -28,6 +28,9 @@ object FunctionUtils {
     lazy val isDistributive = fd.annotations.contains("distributive")
     lazy val compose = fd.annotations.contains("compose")
     lazy val isLibrary = fd.annotations.contains("library")
+    lazy val hasFieldFlag = fd.flags.contains(IsField(false))
+    lazy val hasLazyFieldFlag = fd.flags.contains(IsField(true))
+    lazy val isUserFunction = !hasFieldFlag && !hasLazyFieldFlag
 
     //the template function
     lazy val tmplFunctionName = "tmpl"
diff --git a/src/main/scala/leon/invariant/util/CallGraph.scala b/src/main/scala/leon/invariant/util/CallGraph.scala
index fafa24c6c56a47d38220e93a8e1dd8d34916177a..526b428a96b9bd1ae06152893e182e8a524112f3 100644
--- a/src/main/scala/leon/invariant/util/CallGraph.scala
+++ b/src/main/scala/leon/invariant/util/CallGraph.scala
@@ -109,14 +109,15 @@ class CallGraph {
 
 object CallGraphUtil {
 
-  def constructCallGraph(prog: Program, onlyBody: Boolean = false, withTemplates: Boolean = false): CallGraph = {
-//
-    // println("Constructing call graph")
+  def constructCallGraph(prog: Program,
+      onlyBody: Boolean = false,
+      withTemplates: Boolean = false,
+      calleesFun: Expr => Set[FunDef] = getCallees): CallGraph = {
+
     val cg = new CallGraph()
     functionsWOFields(prog.definedFunctions).foreach((fd) => {
       cg.addFunction(fd)
       if (fd.hasBody) {
-        // println("Adding func " + fd.id.uniqueName)
         var funExpr = fd.body.get
         if (!onlyBody) {
           if (fd.hasPrecondition)
@@ -127,9 +128,8 @@ object CallGraphUtil {
         if (withTemplates && fd.hasTemplate) {
           funExpr = Tuple(Seq(funExpr, fd.getTemplate))
         }
-
         //introduce a new edge for every callee
-        getCallees(funExpr).foreach(cg.addEdgeIfNotPresent(fd, _))
+        calleesFun(funExpr).foreach(cg.addEdgeIfNotPresent(fd, _))
       }
     })
     cg
diff --git a/src/main/scala/leon/laziness/LazinessEliminationPhase.scala b/src/main/scala/leon/laziness/LazinessEliminationPhase.scala
index 5425760483dd4275bde55ec502ba8454a14f54ed..00031ad846c87bd9d32717c067e331dd710bb95d 100644
--- a/src/main/scala/leon/laziness/LazinessEliminationPhase.scala
+++ b/src/main/scala/leon/laziness/LazinessEliminationPhase.scala
@@ -39,15 +39,15 @@ import PredicateUtil._
 object LazinessEliminationPhase extends TransformationPhase {
   val debugLifting = false
   val dumpInputProg = false
-  val dumpLiftProg = false
-  val dumpProgramWithClosures = false
+  val dumpLiftProg = true
+  val dumpProgramWithClosures = true
   val dumpTypeCorrectProg = false
-  val dumpProgWithPreAsserts = true
-  val dumpProgWOInstSpecs = false
-  val dumpInstrumentedProgram = false
+  val dumpProgWithPreAsserts = false
+  val dumpProgWOInstSpecs = true
+  val dumpInstrumentedProgram = true
   val debugSolvers = false
-  val skipVerification = false
-  val prettyPrint = false
+  val skipStateVerification = false
+  val skipResourceVerification = false
   val debugInstVCs = false
 
   val name = "Laziness Elimination Phase"
@@ -70,12 +70,15 @@ object LazinessEliminationPhase extends TransformationPhase {
     if (dumpInputProg)
       println("Input prog: \n" + ScalaPrinter.apply(prog))
     val nprog = liftLazyExpressions(prog)
-    if(dumpLiftProg)
-      println("After lifting expressions: \n" + ScalaPrinter.apply(nprog))
+    if(dumpLiftProg) {
+      prettyPrintProgramToFile(nprog, ctx, "-lifted")
+    }
 
     val progWithClosures = (new LazyClosureConverter(nprog, ctx, new LazyClosureFactory(nprog))).apply
-    if (dumpProgramWithClosures)
-      println("After closure conversion: \n" + ScalaPrinter.apply(progWithClosures))
+    if (dumpProgramWithClosures) {
+      //println("After closure conversion: \n" + ScalaPrinter.apply(progWithClosures, purescala.PrinterOptions(printUniqueIds = true)))
+      prettyPrintProgramToFile(progWithClosures, ctx, "-closures")
+    }
 
     //Rectify type parameters and local types
     val typeCorrectProg = (new TypeRectifier(progWithClosures, tp => tp.id.name.endsWith("@"))).apply
@@ -85,29 +88,28 @@ object LazinessEliminationPhase extends TransformationPhase {
     val progWithPre = (new ClosurePreAsserter(typeCorrectProg)).apply
     if (dumpProgWithPreAsserts) {
       //println("After asserting closure preconditions: \n" + ScalaPrinter.apply(progWithPre))
-      prettyPrintProgramToFile(progWithPre, ctx)
+      prettyPrintProgramToFile(progWithPre, ctx, "-withpre")
     }
 
     // verify the contracts that do not use resources
     val progWOInstSpecs = removeInstrumentationSpecs(progWithPre)
     if (dumpProgWOInstSpecs) {
-      println("After removing instrumentation specs: \n" + ScalaPrinter.apply(progWOInstSpecs))
+      //println("After removing instrumentation specs: \n" + ScalaPrinter.apply(progWOInstSpecs))
+      prettyPrintProgramToFile(progWOInstSpecs, ctx, "-woinst")
     }
-    if (!skipVerification)
+    if (!skipStateVerification)
       checkSpecifications(progWOInstSpecs)
 
     // instrument the program for resources (note: we avoid checking preconditions again here)
-    val instProg = (new LazyInstrumenter(typeCorrectProg)).apply
-    if (dumpInstrumentedProgram){
-      println("After instrumentation: \n" + ScalaPrinter.apply(instProg))
-      prettyPrintProgramToFile(instProg, ctx)
+    val instrumenter = new LazyInstrumenter(typeCorrectProg)
+    val instProg = instrumenter.apply
+    if (dumpInstrumentedProgram) {
+      //println("After instrumentation: \n" + ScalaPrinter.apply(instProg))
+      prettyPrintProgramToFile(instProg, ctx, "-withinst")
     }
-
     // check specifications (to be moved to a different phase)
-    if (!skipVerification)
+    if (!skipResourceVerification)
       checkInstrumentationSpecs(instProg)
-    if (prettyPrint)
-      prettyPrintProgramToFile(instProg, ctx)
     instProg
   }
 
@@ -183,6 +185,18 @@ object LazinessEliminationPhase extends TransformationPhase {
     nprog
   }
 
+  /**
+   * Deconstructs a nested chain of lets (a let*): returns a function that rebuilds
+   * the lets around a new body, together with the innermost body of the let*
+   */
+  def letStarUnapply(e: Expr): (Expr => Expr, Expr) = e match {
+      case Let(binder, letv, letb) =>
+        val (cons, body) = letStarUnapply(letb)
+        (ne => Let(binder, letv, cons(ne)), body)
+      case base =>
+        (e => e, base)
+    }
+
   def removeInstrumentationSpecs(p: Program): Program = {
     def hasInstVar(e: Expr) = {
       exists { e => InstUtil.InstTypes.exists(i => i.isInstVariable(e)) }(e)
@@ -193,7 +207,14 @@ object LazinessEliminationPhase extends TransformationPhase {
         val Lambda(resdef, pbody) = fd.postcondition.get
         val npost = pbody match {
           case And(args) =>
-            PredicateUtil.createAnd(args.filterNot(hasInstVar))
+            createAnd(args.filterNot(hasInstVar))
+          case l : Let =>  // checks if the body of the let can be deconstructed as And
+            val (letsCons, letsBody) = letStarUnapply(l)
+            letsBody match {
+              case And(args) =>
+                letsCons(createAnd(args.filterNot(hasInstVar)))
+              case _ => Util.tru
+            }
           case e => Util.tru
         }
         (fd -> Lambda(resdef, npost))
@@ -212,6 +233,9 @@ object LazinessEliminationPhase extends TransformationPhase {
     val ctx = Main.processOptions(Seq("--solvers=smt-cvc4,smt-z3", "--assumepre") ++ solverOptions ++ functions)
     val report = VerificationPhase.apply(ctx, prog)
     println(report.summaryString)
+    /*ctx.reporter.whenDebug(leon.utils.DebugSectionTimers) { debug =>
+        ctx.timers.outputTable(debug)
+      }*/
   }
 
   def checkInstrumentationSpecs(p: Program) = {
@@ -231,7 +255,8 @@ object LazinessEliminationPhase extends TransformationPhase {
 
     val vcs = p.definedFunctions.collect {
       // skipping verification of uninterpreted functions
-      case fd if !fd.isLibrary && fd.hasBody && fd.postcondition.exists{post =>
+      case fd if !fd.isLibrary && InstUtil.isInstrumented(fd) && fd.hasBody &&
+      fd.postcondition.exists{post =>
         val Lambda(Seq(resdef), pbody) =  post
         accessesSecondRes(pbody, resdef.id)
       } =>
@@ -240,10 +265,20 @@ object LazinessEliminationPhase extends TransformationPhase {
           case And(args) =>
             val (instSpecs, rest) = args.partition(accessesSecondRes(_, resdef.id))
             (createAnd(instSpecs), createAnd(rest))
+          case l: Let =>
+            val (letsCons, letsBody) = letStarUnapply(l)
+            letsBody match {
+              case And(args) =>
+                val (instSpecs, rest) = args.partition(accessesSecondRes(_, resdef.id))
+                (letsCons(createAnd(instSpecs)),
+                    letsCons(createAnd(rest)))
+              case _ =>
+                (l, Util.tru)
+            }
           case e => (e, Util.tru)
         }
-        val vc = implies(createAnd(Seq(fd.precOrTrue, assumptions)),
-            application(Lambda(Seq(resdef), instPost), Seq(fd.body.get)))
+        val vc = implies(createAnd(Seq(fd.precOrTrue, assumptions,
+            Equals(resdef.id.toVariable, fd.body.get))), instPost)
         if(debugInstVCs)
           println(s"VC for function ${fd.id} : "+vc)
         VC(vc, fd, VCKinds.Postcondition)
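
Note on letStarUnapply (LazinessEliminationPhase):
--------------------------------------------------
The helper splits a postcondition that begins with a chain of lets into a rebuilding
function and the innermost body, so that the conjuncts of an And sitting under the lets
can be filtered and the lets re-wrapped around the result. A rough standalone sketch of
the idea, using a tiny stand-in Expr type rather than Leon's purescala AST (all names
below are illustrative only):

  object LetStarSketch extends App {
    sealed trait Expr
    case class Let(binder: String, value: Expr, body: Expr) extends Expr
    case class And(args: Seq[Expr]) extends Expr
    case class Atom(name: String) extends Expr

    // returns (rebuild, innermost body) of a let* chain
    def letStarUnapply(e: Expr): (Expr => Expr, Expr) = e match {
      case Let(b, v, rest) =>
        val (rebuild, body) = letStarUnapply(rest)
        ((ne: Expr) => Let(b, v, rebuild(ne)), body)
      case base => ((ne: Expr) => ne, base)
    }

    // e.g. drop one conjunct of the innermost And and re-wrap the lets around the rest
    val post = Let("x", Atom("e1"), Let("y", Atom("e2"), And(Seq(Atom("p"), Atom("q")))))
    val (rebuild, innermost) = letStarUnapply(post)
    val cleaned = innermost match {
      case And(args) => rebuild(And(args.filterNot(_ == Atom("q"))))
      case other     => rebuild(other)
    }
    println(cleaned) // Let(x,Atom(e1),Let(y,Atom(e2),And(List(Atom(p)))))
  }
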
diff --git a/src/main/scala/leon/laziness/LazinessUtil.scala b/src/main/scala/leon/laziness/LazinessUtil.scala
index a2c2338ca2ba00e84e234f3ebb933753f29e64b1..80a43f648d5fa4f70e7bb3b777193fd1ef8d2ce2 100644
--- a/src/main/scala/leon/laziness/LazinessUtil.scala
+++ b/src/main/scala/leon/laziness/LazinessUtil.scala
@@ -24,10 +24,11 @@ import leon.LeonContext
 import leon.LeonOptionDef
 import leon.Main
 import leon.TransformationPhase
+import purescala.PrinterOptions
 
 object LazinessUtil {
 
-  def prettyPrintProgramToFile(p: Program, ctx: LeonContext) {
+  def prettyPrintProgramToFile(p: Program, ctx: LeonContext, suffix: String) {
     val optOutputDirectory = new LeonOptionDef[String] {
       val name = "o"
       val description = "Output directory"
@@ -44,12 +45,14 @@ object LazinessUtil {
     }
 
     for (u <- p.units if u.isMainUnit) {
-      val outputFile = s"$outputFolder${File.separator}${u.id.toString}.scala"
+      val outputFile = s"$outputFolder${File.separator}${u.id.toString}$suffix.scala"
       try {
         val out = new BufferedWriter(new FileWriter(outputFile))
         // remove '@' from the end of the identifier names
-        val pat = new Regex("""(\w+)([@*]+)(\w*)""", "base", "AtorStar", "rest")
-        val pgmText = pat.replaceAllIn(ScalaPrinter.apply(p), m => m.group("base") + m.group("rest"))
+        val pat = new Regex("""(\w+)(@)(\w*)(\*?)(\S*)""", "base", "at", "mid", "star", "rest")
+        val pgmText = pat.replaceAllIn(ScalaPrinter.apply(p),
+          m => m.group("base") + m.group("mid") + (
+            if (!m.group("star").isEmpty()) "S" else "") + m.group("rest"))
         out.write(pgmText)
         out.close()
       } catch {
@@ -73,12 +76,44 @@ object LazinessUtil {
       false
   }
 
+  def isInStateCall(e: Expr)(implicit p: Program): Boolean = e match {
+    case FunctionInvocation(TypedFunDef(fd, _), Seq()) =>
+      fullName(fd)(p) == "leon.lazyeval.$.inState"
+    case _ =>
+      false
+  }
+
+  def isOutStateCall(e: Expr)(implicit p: Program): Boolean = e match {
+    case FunctionInvocation(TypedFunDef(fd, _), Seq()) =>
+      fullName(fd)(p) == "leon.lazyeval.$.outState"
+    case _ =>
+      false
+  }
+
   def isEvaluatedInvocation(e: Expr)(implicit p: Program): Boolean = e match {
     case FunctionInvocation(TypedFunDef(fd, _), Seq(_)) =>
       fullName(fd)(p) == "leon.lazyeval.$.isEvaluated"
     case _ => false
   }
 
+  def isSuspInvocation(e: Expr)(implicit p: Program): Boolean = e match {
+    case FunctionInvocation(TypedFunDef(fd, _), Seq(_, _)) =>
+      fullName(fd)(p) == "leon.lazyeval.$.isSuspension"
+    case _ => false
+  }
+
+  def isWithStateCons(e: Expr)(implicit p: Program): Boolean = e match {
+    case CaseClass(cct, Seq(_)) =>
+      fullName(cct.classDef)(p) == "leon.lazyeval.$.WithState"
+    case _ => false
+  }
+
+  def isWithStateFun(e: Expr)(implicit p: Program): Boolean = e match {
+    case FunctionInvocation(TypedFunDef(fd, _), Seq(_, _)) =>
+      fullName(fd)(p) == "leon.lazyeval.WithState.withState"
+    case _ => false
+  }
+
   def isValueInvocation(e: Expr)(implicit p: Program): Boolean = e match {
     case FunctionInvocation(TypedFunDef(fd, _), Seq(_)) =>
       fullName(fd)(p) == "leon.lazyeval.$.value"
diff --git a/src/main/scala/leon/laziness/LazyClosureConverter.scala b/src/main/scala/leon/laziness/LazyClosureConverter.scala
index f4c81eb3d81c9a8d34f2ed27556ab42ec5fcaa23..4075adf96e94233c37fa761ada5efbba51d78637 100644
--- a/src/main/scala/leon/laziness/LazyClosureConverter.scala
+++ b/src/main/scala/leon/laziness/LazyClosureConverter.scala
@@ -25,7 +25,8 @@ import leon.LeonOptionDef
 import leon.Main
 import leon.TransformationPhase
 import LazinessUtil._
-import invariant.util.ProgramUtil._
+import ProgramUtil._
+import PredicateUtil._
 import purescala.TypeOps._
 
 /**
@@ -38,7 +39,7 @@ import purescala.TypeOps._
 class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClosureFactory) {
   val debug = false
   // flags
-  val removeRecursionViaEval = false
+  //val removeRecursionViaEval = false
   val useRefEquality = ctx.findOption(LazinessEliminationPhase.optRefEquality).getOrElse(false)
 
   val funsManager = new LazyFunctionsManager(p)
@@ -83,15 +84,44 @@ class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClo
       (fd -> nfd)
   }.toMap
 
+  // the commented-out map below was used for optimization purposes
+  //  lazy val uiTarges = funMap.collect {
+  //    case (k, v) if closureFactory.isLazyOp(k) =>
+  //      val ufd = new FunDef(FreshIdentifier(v.id.name, v.id.getType, true),
+  //        v.tparams, v.params, v.returnType)
+  //      (k -> ufd)
+  //  }.toMap
+
   /**
-   * A set of uninterpreted functions that may be used as targets
-   * of closures in the eval functions, for efficiency reasons.
+   * A set of uninterpreted functions that are used
+   * in specs to ensure that the value part of the result is independent of the state
    */
-  lazy val uninterpretedTargets = funMap.collect {
-    case (k, v) if closureFactory.isLazyOp(k) =>
-      val ufd = new FunDef(FreshIdentifier(v.id.name, v.id.getType, true),
-        v.tparams, v.params, v.returnType)
-      (k -> ufd)
+  val uiFuncs = funMap.collect {
+    case (k, v) if funsNeedStates(k) &&
+      funsManager.hasStateIndependentBehavior(k) => //TODO: we can omit some functions whose translations will not be recursive.
+
+      val params = v.params.take(k.params.size) // ignore the state params
+      val retType =
+        if (funsRetStates(k)) {
+          val TupleType(valtype +: _) = v.returnType
+          valtype
+        } else v.returnType
+      val tparams = (params.map(_.getType) :+ retType).flatMap(getTypeParameters(_)).distinct
+      val tparamsDef =  tparams.map(TypeParameterDef(_))
+      val ufd = new FunDef(FreshIdentifier(v.id.name + "UI"), tparamsDef, params, retType)
+
+      // we also need to create a function that assumes that the result equals
+      // the uninterpreted function
+      val valres = ValDef(FreshIdentifier("valres", retType))
+      val pred = new FunDef(FreshIdentifier(v.id.name + "ValPred"), tparamsDef,
+          params :+ valres, BooleanType)
+      val resid = FreshIdentifier("res", pred.returnType)
+      pred.fullBody = Ensuring(
+          Equals(valres.id.toVariable,
+              FunctionInvocation(TypedFunDef(ufd, tparams), params.map(_.id.toVariable))), // res = funUI(..)
+              Lambda(Seq(ValDef(resid)), resid.toVariable)) // holds
+      pred.addFlag(Annotation("axiom", Seq())) // mark the predicate as an axiom
+      (k -> (ufd, pred))
   }.toMap
 
   /**
@@ -171,36 +201,22 @@ class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClo
       // create a body of the match
       val args = cdef.fields map { fld => CaseClassSelector(ctype, binder.toVariable, fld.id) }
       val op = closureFactory.caseClassToOp(cdef)
-      val targetFun =
-        if (removeRecursionViaEval && op.hasPostcondition) {
-          // checking for postcondition is a hack to make sure that we
-          // do not make all targets uninterpreted
-          uninterpretedTargets(op)
-        } else funMap(op)
-      // TODO: here we are assuming that only one state is used, fix this.
-
-      //(a) construct the value component
-      val stArgs =
-        if (funsNeedStates(op)) {
-          // Note: it is important to use uninterpreted state here for 2 reasons:
-          // (a) to eliminate dependency on state for the result value
-          // (b) to avoid inconsistencies when using a fixed state such as empty state
-          Seq(getUninterpretedState(tname, tparams))
-        } else Seq()
-      //println(s"invoking function $targetFun with args $args")
-      val invoke = FunctionInvocation(TypedFunDef(targetFun, tparams), args ++ stArgs) // we assume that the type parameters of lazy ops are same as absdefs
-      val invPart = if (funsRetStates(op)) {
-        TupleSelect(invoke, 1) // we are only interested in the value
-      } else invoke
-
-      //(b) construct the state component
-      val stPart = if (funsRetStates(op)) {
-        val stInvoke = FunctionInvocation(TypedFunDef(targetFun, tparams),
+      val targetFun = funMap(op)
+      // invoke the target fun with appropriate values
+      val invoke = FunctionInvocation(TypedFunDef(targetFun, tparams),
           args ++ (if (funsNeedStates(op)) Seq(param2.toVariable) else Seq()))
-        TupleSelect(stInvoke, 2)
-      } else param2.toVariable
-      val newst = SetUnion(stPart, FiniteSet(Set(binder.toVariable), stType.base))
-      val rhs = Tuple(Seq(invPart, newst))
+      val invokeRes = FreshIdentifier("dres", invoke.getType)
+      //println(s"invoking function $targetFun with args $args")
+      val (valPart, stPart) = if (funsRetStates(op)) {
+        // TODO: here we are assuming that only one state is used, fix this.
+        val invokeSt = TupleSelect(invokeRes.toVariable, 2)
+        (TupleSelect(invokeRes.toVariable, 1),
+            SetUnion(invokeSt, FiniteSet(Set(binder.toVariable), stType.base)))
+      } else {
+        (invokeRes.toVariable,
+            SetUnion(param2.toVariable, FiniteSet(Set(binder.toVariable), stType.base)))
+      }
+      val rhs = Let(invokeRes, invoke, Tuple(Seq(valPart, stPart)))
       MatchCase(pattern, None, rhs)
     }
     // create a new match case for eager evaluation
@@ -298,6 +314,46 @@ class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClo
       }, false)
       mapNAryOperator(args, op)
 
+    case finv @ FunctionInvocation(_, Seq(recvr, funcArg)) if isSuspInvocation(finv)(p) =>
+      ((st: Map[String, Expr]) => {
+        // `funcArg` is a closure whose body is a function invocation
+        //TODO: make sure the function is not partially applied in the body
+        funcArg match {
+          case Lambda(_, FunctionInvocation(TypedFunDef(fd, _), _)) =>
+            // retrieve the case-class for the operation from the factory
+            val caseClass = closureFactory.closureOfLazyOp(fd)
+            val targs = TypeUtil.getTypeArguments(unwrapLazyType(recvr.getType).get)
+            val caseClassType = CaseClassType(caseClass, targs)
+            IsInstanceOf(recvr, caseClassType)
+          case _ =>
+            throw new IllegalArgumentException("The argument to isSuspension should be " +
+              "a partially applied function of the form: <method-name> _")
+        }
+      }, false)
+
+    case finv @ FunctionInvocation(_, Seq(recvr, stArg)) if isWithStateFun(finv)(p) =>
+      // recvr is a `WithStateCaseClass` and `stArg` could be an arbitrary expression that returns a state
+      val CaseClass(_, Seq(exprNeedingState)) = recvr
+      val (nexpr, exprReturnsState) = mapBody(exprNeedingState)
+      val (nstArg, stArgReturnsState) = mapBody(stArg)
+      if(stArgReturnsState)
+        throw new IllegalStateException("The state argument to `withState` returns a new state, which is not supported: "+finv)
+      else {
+        ((st: Map[String, Expr]) => {
+          val nst = nstArg(st)
+          // compute the baseType
+          stArg.getType match {
+            case SetType(lazyType) =>
+              val baseType = unwrapLazyType(lazyType).get
+              val tname = typeNameWOParams(baseType)
+              val newStates = st + (tname -> nst)
+              nexpr(newStates)
+            case t =>
+              throw new IllegalStateException(s"$stArg should have a set type, current type: "+t)
+          }
+        }, exprReturnsState)
+      }
+
     case finv @ FunctionInvocation(_, args) if isValueInvocation(finv)(p) => // is  value function ?
       val op = (nargs: Seq[Expr]) => ((st: Map[String, Expr]) => {
         val baseType = unwrapLazyType(nargs(0).getType).get // there must be only one argument here
@@ -305,7 +361,7 @@ class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClo
         val dispFun = evalFunctions(tname)
         val dispFunInv = FunctionInvocation(TypedFunDef(dispFun,
           getTypeParameters(baseType)), nargs :+ st(tname))
-        val dispRes = FreshIdentifier("dres", dispFun.returnType)
+        val dispRes = FreshIdentifier("dres", dispFun.returnType, true)
         val nstates = tnames map {
           case `tname` =>
             TupleSelect(dispRes.toVariable, 2)
@@ -499,26 +555,89 @@ class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClo
           else Some(npreFun(initStateMap))
       }
 
-      val simpPost = if (fd.hasPostcondition) {
-        val Lambda(arg @ Seq(ValDef(resid, _)), post) = fd.postcondition.get
-        val (npostFun, postUpdatesState) = mapBody(post)
-        val newres = FreshIdentifier(resid.name, bodyWithState.getType)
-        val npost1 =
-          if (bodyUpdatesState || funsRetStates(fd)) {
-            val stmap = tnames.zipWithIndex.map {
-              case (tn, i) => (tn -> TupleSelect(newres.toVariable, i + 2))
-            }.toMap
-            replace(Map(resid.toVariable -> TupleSelect(newres.toVariable, 1)), npostFun(stmap))
-          } else
-            replace(Map(resid.toVariable -> newres.toVariable), npostFun(initStateMap))
-        val npost =
-          if (postUpdatesState) {
-            TupleSelect(npost1, 1) // ignore state updated by post
-          } else npost1
-        Some(Lambda(Seq(ValDef(newres)), npost))
-      } else None
+      // create a new result variable
+      val newres =
+        if (fd.hasPostcondition) {
+          val Lambda(Seq(ValDef(r, _)), _) = fd.postcondition.get
+          FreshIdentifier(r.name, bodyWithState.getType)
+        } else FreshIdentifier("r", nfd.returnType)
+
+      // create an output state map
+      val outStateMap =
+        if (bodyUpdatesState || funsRetStates(fd)) {
+          tnames.zipWithIndex.map {
+            case (tn, i) => (tn -> TupleSelect(newres.toVariable, i + 2))
+          }.toMap
+        } else
+          initStateMap
+
+      // create a specification that relates input-output states
+      val stateRel =
+        if (funsRetStates(fd)) { // add specs on states
+          val instates = initStateMap.values.toSeq
+          val outstates = outStateMap.values.toSeq
+          val stateRel =
+            if(fd.annotations.contains("invstate")) Equals.apply _
+            else SubsetOf.apply _
+          Some(createAnd((instates zip outstates).map(p => stateRel(p._1, p._2))))
+        } else None
+
+      // create a predicate that ensures that the value part is independent of the state
+      val valRel =
+        if (uiFuncs.contains(fd)) { // add specs on value
+          val uipred = uiFuncs(fd)._2
+          val args = nfd.params.take(fd.params.size).map(_.id.toVariable)
+          val retarg =
+            if(funsRetStates(fd))
+              TupleSelect(newres.toVariable, 1)
+              else newres.toVariable
+          Some(FunctionInvocation(TypedFunDef(uipred, nfd.tparams.map(_.tp)),
+              args :+ retarg))
+        } else None
+
+      val targetPost =
+        if (fd.hasPostcondition) {
+          val Lambda(Seq(ValDef(resid, _)), post) = fd.postcondition.get
+          // bind inState and outState calls to their respective state values
+          val tpost = simplePostTransform {
+            case e if LazinessUtil.isInStateCall(e)(p) =>
+              val baseType = getTypeArguments(e.getType).head
+              initStateMap(typeNameWOParams(baseType))
+
+            case e if LazinessUtil.isOutStateCall(e)(p) =>
+              val baseType = getTypeArguments(e.getType).head
+              outStateMap(typeNameWOParams(baseType))
+
+            case e => e
+          }(post)
+          // thread state through postcondition
+          val (npostFun, postUpdatesState) = mapBody(tpost)
+          val resval =
+            if (bodyUpdatesState || funsRetStates(fd))
+              TupleSelect(newres.toVariable, 1)
+            else newres.toVariable
+          val npostWithState = replace(Map(resid.toVariable -> resval), npostFun(outStateMap))
+          val npost =
+            if (postUpdatesState) {
+              TupleSelect(npostWithState, 1) // ignore state updated by post
+            } else
+              npostWithState
+          Some(npost)
+        } else {
+          None
+        }
+      nfd.postcondition = Some(Lambda(Seq(ValDef(newres)),
+          createAnd(stateRel.toList ++ valRel.toList ++ targetPost.toList)))
+//      if (removeRecursionViaEval) {
+//        uninterpretedTargets.get(fd) match {
+//          case Some(uitarget) =>
+//            // uitarget uses the same identifiers as nfd
+//            uitarget.postcondition = targetPost
+//          case None => ;
+//        }
+//      }
       // add invariants on state
-      val fpost =
+      /*val fpost =
         if (funsRetStates(fd)) { // add specs on states
           val instates = stateParams
           val resid = if (fd.hasPostcondition) {
@@ -538,59 +657,32 @@ class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClo
             And(p, statePred)
           } else statePred))
           Some(post)
-        } else simpPost
-      if (fpost.isDefined) {
-        nfd.postcondition = fpost
+        } else simpPost*/
+      //if (fpost.isDefined) {
         // also attach postcondition to uninterpreted targets
-        if (removeRecursionViaEval) {
-          uninterpretedTargets.get(fd) match {
-            case Some(uitarget) =>
-              /*val nfdRes = fpost.get.args(0).id
-              val repmap: Map[Expr, Expr] = ((nfdRes.toVariable, FreshIdentifier(nfdRes.name).toVariable) +:
-                (nfd.params.map(_.id.toVariable) zip uitarget.params.map(_.id.toVariable))).toMap*/
-              // uitarget uses the same identifiers as nfd
-              uitarget.postcondition = fpost
-            case None => ;
-          }
-        }
-      }
+      //}
   }
 
-  /**
-   * This method is not used for now,
-   */
   def assignContractsForEvals = evalFunctions.foreach {
     case (tname, evalfd) =>
       val cdefs = closureFactory.closures(tname)
       val tparams = evalfd.tparams.map(_.tp)
       val postres = FreshIdentifier("res", evalfd.returnType)
-      val postMatchCases = cdefs map { cdef =>
-        val ctype = CaseClassType(cdef, tparams)
-        val binder = FreshIdentifier("t", ctype)
-        val pattern = InstanceOfPattern(Some(binder), ctype)
-        // create a body of the match
+      val postMatchCases = cdefs flatMap { cdef =>
+        // create a body of the match (which asserts that the return value equals the uninterpreted function)
         val op = closureFactory.lazyopOfClosure(cdef)
-        val targetFd = funMap(op)
-        val rhs = if (targetFd.hasPostcondition) {
-          val Lambda(Seq(resparam), targetPost) = targetFd.postcondition.get
+        if (uiFuncs.contains(op)) {
+          val ctype = CaseClassType(cdef, tparams)
+          val binder = FreshIdentifier("t", ctype)
+          val pattern = InstanceOfPattern(Some(binder), ctype)
           val args = cdef.fields map { fld => CaseClassSelector(ctype, binder.toVariable, fld.id) }
-          val stArgs =
-            if (funsNeedStates(op)) // TODO: here we are assuming that only one state is used, fix this.
-              Seq(evalfd.params.last.toVariable)
-            else Seq()
-          val resarg =
-            if (funsRetStates(op))
-              postres.toVariable
-            else
-              TupleSelect(postres.toVariable, 1) // here 'targetFd' does not return state, but eval does
-          val replaceMap = (targetFd.params.map(_.toVariable) zip (args ++ stArgs)).toMap[Expr, Expr] +
-            (resparam.toVariable -> resarg)
-          replace(replaceMap, targetPost)
-        } else
-          Util.tru
-        MatchCase(pattern, None, rhs)
+          val rhs = Equals(TupleSelect(postres.toVariable, 1),
+            FunctionInvocation(TypedFunDef(uiFuncs(op)._1, tparams), args)
+            )
+          Seq(MatchCase(pattern, None, rhs))
+        } else Seq()
       }
-      // create a default case to match eagerClosures
+      // create a default case to match the other cases
       val default = MatchCase(WildcardPattern(None), None, Util.tru)
       evalfd.postcondition = Some(
         Lambda(Seq(ValDef(postres)),
@@ -623,18 +715,16 @@ class LazyClosureConverter(p: Program, ctx: LeonContext, closureFactory: LazyClo
     val anchor = funMap.values.last
     transformCaseClasses
     assignBodiesToFunctions
-    //assignContractsForEvals
+    assignContractsForEvals
     addDefs(
       copyProgram(p,
         (defs: Seq[Definition]) => defs.flatMap {
           case fd: FunDef if funMap.contains(fd) =>
-            if (removeRecursionViaEval) {
-              uninterpretedTargets.get(fd) match {
-                case Some(uitarget) =>
-                  Seq(funMap(fd), uitarget)
-                case _ => Seq(funMap(fd))
-              }
-            } else Seq(funMap(fd))
+            uiFuncs.get(fd) match {
+              case Some((funui, predui)) =>
+                Seq(funMap(fd), funui, predui)
+              case _ => Seq(funMap(fd))
+            }
           case d => Seq(d)
         }),
       closureFactory.allClosuresAndParents ++ closureCons.values ++
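
Note on uiFuncs (LazyClosureConverter):
---------------------------------------
For every lazy operation that needs the state but behaves independently of it, the
converter now emits an uninterpreted twin of the translated function together with a
predicate that pins the value component of the result to that twin; assignContractsForEvals
then asserts, in the postcondition of each eval function, that the value returned for the
corresponding closure equals the uninterpreted function. Roughly, the generated pair has
the following source-level shape (illustrative only; the real definitions are built as
purescala FunDefs and the names below are made up):

  // for some op with translated signature f(x: BigInt): BigInt
  def fUI(x: BigInt): BigInt = ???           // body left uninterpreted for the solver

  def fValPred(x: BigInt, valres: BigInt): Boolean = {
    valres == fUI(x)                         // the value part of the result equals fUI(args)
  } ensuring (res => res)                    // flagged as an axiom in the generated program

The intent, per the comment on uiFuncs, is to make the value part of a function's result
provably independent of the state it is evaluated in.
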
diff --git a/src/main/scala/leon/laziness/LazyFunctionsManager.scala b/src/main/scala/leon/laziness/LazyFunctionsManager.scala
new file mode 100644
index 0000000000000000000000000000000000000000..1aa4f5a4b0a627fe1f2c8f7645807560cc3622cd
--- /dev/null
+++ b/src/main/scala/leon/laziness/LazyFunctionsManager.scala
@@ -0,0 +1,117 @@
+package leon
+package laziness
+
+import invariant.factories._
+import invariant.util.Util._
+import invariant.util._
+import invariant.structure.FunctionUtils._
+import purescala.ScalaPrinter
+import purescala.Common._
+import purescala.Definitions._
+import purescala.Expressions._
+import purescala.ExprOps._
+import purescala.DefOps._
+import purescala.Extractors._
+import purescala.Types._
+import leon.invariant.util.TypeUtil._
+import leon.invariant.util.LetTupleSimplification._
+import java.io.File
+import java.io.FileWriter
+import java.io.BufferedWriter
+import scala.util.matching.Regex
+import leon.purescala.PrettyPrinter
+import leon.LeonContext
+import leon.LeonOptionDef
+import leon.Main
+import leon.TransformationPhase
+import LazinessUtil._
+
+class LazyFunctionsManager(p: Program) {
+
+  // includes calls made through the specs
+  val cg = CallGraphUtil.constructCallGraph(p, false, true,
+    // a specialized callee function that ignores functions called inside `withState` calls, because they would have state as an argument
+    (inexpr: Expr) => {
+      var callees = Set[FunDef]()
+      def rec(e: Expr): Unit = e match {
+        case cc @ CaseClass(_, args) if LazinessUtil.isWithStateCons(cc)(p) =>
+          ; //nothing to be done
+        case f : FunctionInvocation if LazinessUtil.isSuspInvocation(f)(p) =>
+          // we can ignore the arguments to susp invocation as they are not actual calls, but only a test
+          ;
+        //note: do not consider field invocations
+        case f @ FunctionInvocation(TypedFunDef(callee, _), args) if callee.isRealFunction =>
+          callees += callee
+          args map rec
+        case Operator(args, _) => args map rec
+      }
+      rec(inexpr)
+      callees
+    })
+
+  val (funsNeedStates, funsRetStates) = {
+    var needRoots = Set[FunDef]()
+    var retRoots = Set[FunDef]()
+    p.definedFunctions.foreach {
+      case fd if fd.hasBody =>
+        postTraversal {
+          case finv: FunctionInvocation if isLazyInvocation(finv)(p) =>
+            // the lazy invocation constructor will need the state
+            needRoots += fd
+          case finv: FunctionInvocation if isEvaluatedInvocation(finv)(p) =>
+            needRoots += fd
+          case finv: FunctionInvocation if isValueInvocation(finv)(p) =>
+            needRoots += fd
+            retRoots += fd
+          case _ =>
+            ;
+        }(fd.body.get)
+      case _ => ;
+    }
+    val retfuns = cg.transitiveCallers(retRoots.toSeq)
+    //println("Ret roots: "+retRoots.map(_.id)+" ret funs: "+retfuns.map(_.id))
+    (cg.transitiveCallers(needRoots.toSeq), retfuns)
+  }
+
+  lazy val callersOfLazyCons = {
+    var consRoots = Set[FunDef]()
+    funsNeedStates.foreach {
+      case fd if fd.hasBody =>
+        postTraversal {
+          case finv: FunctionInvocation if isLazyInvocation(finv)(p) => // this is the lazy invocation constructor
+            consRoots += fd
+          case _ =>
+            ;
+        }(fd.body.get)
+      case _ => ;
+    }
+    cg.transitiveCallers(consRoots.toSeq)
+  }
+
+  lazy val cgWithoutSpecs = CallGraphUtil.constructCallGraph(p, true, false)
+  lazy val callersOfIsEvalandIsSusp = {
+    var roots = Set[FunDef]()
+    funsNeedStates.foreach {
+      case fd if fd.hasBody =>
+        postTraversal {
+          case finv: FunctionInvocation if
+            isEvaluatedInvocation(finv)(p) || isSuspInvocation(finv)(p) => // call to isEvaluated || isSusp ?
+            roots += fd
+          case _ =>
+            ;
+        }(fd.body.get)
+      case _ => ;
+    }
+    cgWithoutSpecs.transitiveCallers(roots.toSeq)
+  }
+
+  def isRecursive(fd: FunDef) : Boolean = {
+    cg.isRecursive(fd)
+  }
+
+  def hasStateIndependentBehavior(fd: FunDef) : Boolean = {
+    // every function that does not (transitively) call isEvaluated or isSuspension has state-independent behavior
+    !callersOfIsEvalandIsSusp.contains(fd)
+  }
+
+}
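
Note on hasStateIndependentBehavior (LazyFunctionsManager):
-----------------------------------------------------------
A function is treated as state-independent iff it is not a transitive caller, in the call
graph that ignores specs, of anything that tests the state (isEvaluated or isSuspension).
A simplified stand-in for the transitive-callers computation, over a plain caller-to-callees
map instead of Leon's CallGraph (function names are just strings here):

  object TransCallersSketch extends App {
    def transitiveCallers(calls: Map[String, Set[String]], roots: Set[String]): Set[String] = {
      // invert the edges: callee -> direct callers
      val callersOf: Map[String, Set[String]] =
        calls.toSeq
          .flatMap { case (caller, callees) => callees.map(_ -> caller) }
          .groupBy(_._1).mapValues(_.map(_._2).toSet).toMap
      // least fixpoint: keep adding callers until nothing changes
      def fix(acc: Set[String]): Set[String] = {
        val next = acc ++ acc.flatMap(f => callersOf.getOrElse(f, Set.empty[String]))
        if (next == acc) acc else fix(next)
      }
      fix(roots)
    }

    // if isEvaluated is reachable from `contains` via `lookup`, then neither of them
    // has state-independent behavior
    val calls = Map("contains" -> Set("lookup"), "lookup" -> Set("isEvaluated"))
    println(transitiveCallers(calls, Set("isEvaluated")))
    // Set(isEvaluated, lookup, contains)
  }
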
diff --git a/src/main/scala/leon/laziness/LazyInstrumenter.scala b/src/main/scala/leon/laziness/LazyInstrumenter.scala
index f72b85b88a88e20f507f89a61ddcd05bb7ab02af..9d2803a45f3f3266106242a6d5dfd509a7fe8262 100644
--- a/src/main/scala/leon/laziness/LazyInstrumenter.scala
+++ b/src/main/scala/leon/laziness/LazyInstrumenter.scala
@@ -54,17 +54,17 @@ class LazyInstrumenter(p: Program) {
         val nbody = e match {
           case MatchExpr(scr, mcases) =>
             val ncases = mcases map {
-              case MatchCase(pat, guard, Tuple(Seq(valpart, statepart))) =>
+              case MatchCase(pat, guard, body) =>
                 // instrument the state part (and ignore the val part)
                // (Note: this is a hack to ensure that we always consider only one call to targets)
-                val transState = transform(statepart)(Map())
+                /*val transState = transform(statepart)(Map())
                 val transVal = transform(valpart)(Map())
 
                 val caseId = FreshIdentifier("cd", transState.getType, true)
                 val casePart = Tuple(Seq(TupleSelect(transVal, 1), TupleSelect(caseId.toVariable, 1)))
                 val instPart = instrumenters map { m => selectInst(caseId.toVariable, m.inst) }
-                val lete = Let(caseId, transState, Tuple(casePart +: instPart))
-                MatchCase(pat, guard, lete)
+                val lete = Let(caseId, transState, Tuple(casePart +: instPart))*/
+                MatchCase(pat, guard, transform(body)(Map()))
             }
             MatchExpr(scr, ncases)
         }
diff --git a/src/main/scala/leon/laziness/RefDataTypeManager.scala b/src/main/scala/leon/laziness/RefDataTypeManager.scala
new file mode 100644
index 0000000000000000000000000000000000000000..6a6ac9d8ac70f2238c5d835a27db44732d95d2c2
--- /dev/null
+++ b/src/main/scala/leon/laziness/RefDataTypeManager.scala
@@ -0,0 +1,42 @@
+package leon
+package laziness
+
+import invariant.factories._
+import invariant.util.Util._
+import invariant.util._
+import invariant.structure.FunctionUtils._
+import purescala.ScalaPrinter
+import purescala.Common._
+import purescala.Definitions._
+import purescala.Expressions._
+import purescala.ExprOps._
+import purescala.DefOps._
+import purescala.Extractors._
+import purescala.Types._
+import leon.invariant.util.TypeUtil._
+import leon.invariant.util.LetTupleSimplification._
+import java.io.File
+import java.io.FileWriter
+import java.io.BufferedWriter
+import scala.util.matching.Regex
+import leon.purescala.PrettyPrinter
+import leon.LeonContext
+import leon.LeonOptionDef
+import leon.Main
+import leon.TransformationPhase
+import LazinessUtil._
+import invariant.util.ProgramUtil._
+
+/**
+ * A class that maintains a data type that can be
+ * used to create unique references
+ */
+/*class RefDataTypeManager(p: Program) {
+
+  lazy val valueType = CaseClassType(CaseClassDef(FreshIdentifier("Unit"), Seq(), None, false), Seq())
+  lazy val refStreamDef = {
+    val cd = CaseClassDef(FreshIdentifier("Ref"), Seq(), None, false)
+    cd.setFields(Seq(ValDef(FreshIdentifier("data", valueType)),
+        ValDef(FreshIdentifier("next", valueType)))
+  }
+}*/
diff --git a/src/main/scala/leon/laziness/TypeChecker.scala b/src/main/scala/leon/laziness/TypeChecker.scala
index 6bb093b738763eb1d1da7bcbd581355e1afd2f7e..978c33a367b7a8e33ee7e3a004454c9e61c8c601 100644
--- a/src/main/scala/leon/laziness/TypeChecker.scala
+++ b/src/main/scala/leon/laziness/TypeChecker.scala
@@ -8,6 +8,7 @@ import purescala.Expressions._
 import purescala.ExprOps._
 import purescala.Extractors._
 import purescala.Types._
+import purescala.TypeOps._
 import leon.invariant.util.TypeUtil._
 
 object TypeChecker {
@@ -111,6 +112,7 @@ object TypeChecker {
               MatchCase(npattern, nguard, nrhs)
           }
           val nmatch = MatchExpr(nscr, ncases)
+          //println("Old match expr: "+e+" \n new expr: "+nmatch)
           (nmatch.getType, nmatch)
 
         case cs @ CaseClassSelector(cltype, clExpr, fld) =>
@@ -118,7 +120,8 @@ object TypeChecker {
           // this is a hack. TODO: fix this
           subcast(cltype, ncltype) match {
             case Some(ntype : CaseClassType) =>
-              (ntype, CaseClassSelector(ntype, nclExpr, fld))
+              val nop = CaseClassSelector(ntype, nclExpr, fld)
+              (nop.getType, nop)
             case  _ =>
               throw new IllegalStateException(s"$nclExpr : $ncltype cannot be cast to case class type: $cltype")
           }
@@ -157,17 +160,28 @@ object TypeChecker {
               }
             }
           // for uninterpreted functions, we could have a type parameter used only in the return type
-          val ntparams = (fd.tparams zip tparams).map{
+          val dummyTParam = TypeParameter.fresh("R@")
+          val ntparams = fd.tparams.map(_.tp).zipAll(tparams, dummyTParam, dummyTParam).map{
             case (paramt, argt) =>
-              tpmap.getOrElse(paramt.tp /* in this case we inferred the type parameter */,
+              tpmap.getOrElse(paramt /* in this case we inferred the type parameter */,
                   argt /* in this case we reuse the argument type parameter */ )
           }
           val nexpr = FunctionInvocation(TypedFunDef(fd, ntparams), nargs)
           if (nexpr.getType == Untyped) {
-            throw new IllegalStateException(s"Cannot infer type for expression: $e arg types: ${nargs.map(_.getType).mkString(",")}")
+            throw new IllegalStateException(s"Cannot infer type for expression: $e "+
+                s"arg types: ${nargs.map(_.getType).mkString(",")} \n Callee: ${fd} \n caller: ${nexpr}")
           }
           (nexpr.getType, nexpr)
 
+        case FiniteSet(els, baseType) =>
+          val nels = els.map(rec(_)._2)
+          // make sure every element has the same type (upcast it to the rootType)
+          val nbaseType  = bestRealType(nels.head.getType)
+          if(!nels.forall(el => bestRealType(el.getType) == nbaseType))
+            throw new IllegalStateException("Not all elements in the set have the same type: "+nbaseType)
+          val nop = FiniteSet(nels, nbaseType)
+          (nop.getType, nop)
+
         // need to handle tuple select specially
         case TupleSelect(tup, i) =>
           val nop = TupleSelect(rec(tup)._2, i)
diff --git a/src/main/scala/leon/laziness/TypeRectifier.scala b/src/main/scala/leon/laziness/TypeRectifier.scala
index 89583bbb0b990d14463f8c888180cd70b59ebf34..a253e46c7ae0a4668c61489cfd0c15aa78344d3a 100644
--- a/src/main/scala/leon/laziness/TypeRectifier.scala
+++ b/src/main/scala/leon/laziness/TypeRectifier.scala
@@ -38,7 +38,13 @@ import invariant.util.ProgramUtil._
 class TypeRectifier(p: Program, placeHolderParameter: TypeParameter => Boolean) {
 
   val typeClasses = {
-    var tc = new DisjointSets[TypeTree]()
+    var tc = new DisjointSets[TypeTree]() /*{
+      override def union(x: TypeTree, y: TypeTree) {
+        if (!x.isInstanceOf[TypeParameter] || !y.isInstanceOf[TypeParameter])
+          throw new IllegalStateException(s"Unifying: $x and $y")
+        super.union(x, y)
+      }
+    }*/
     p.definedFunctions.foreach {
       case fd if fd.hasBody && !fd.isLibrary =>
         postTraversal {
@@ -50,7 +56,12 @@ class TypeRectifier(p: Program, placeHolderParameter: TypeParameter => Boolean)
             (fd.params zip args).foreach { x =>
               (x._1.getType, x._2.getType) match {
                 case (SetType(tf: ClassType), SetType(ta: ClassType)) =>
-                  (tf.tps zip ta.tps).foreach { x => tc.union(x._1, x._2) }
+                  // unify only type parameters
+                  (tf.tps zip ta.tps).foreach {
+                    case (t1: TypeParameter, t2: TypeParameter) =>
+                      tc.union(t1, t2)
+                    case _ => ;
+                  }
                 case (tf: TypeParameter, ta: TypeParameter) =>
                   tc.union(tf, ta)
                 case (t1, t2) =>
@@ -113,7 +124,7 @@ class TypeRectifier(p: Program, placeHolderParameter: TypeParameter => Boolean)
     // the tupleExpr if it is not TupleTyped.
     // cannot use simplePostTransform because of this
     def rec(e: Expr): Expr = e match {
-      case FunctionInvocation(TypedFunDef(callee, targsOld), args) =>
+      case FunctionInvocation(TypedFunDef(callee, targsOld), args) => // this is already done by the type checker
         val targs = targsOld.map {
           case tp: TypeParameter => tpMap.getOrElse(tp, tp)
           case t                 => t
@@ -123,18 +134,34 @@ class TypeRectifier(p: Program, placeHolderParameter: TypeParameter => Boolean)
             fdMap(callee)._1
           else callee
         FunctionInvocation(TypedFunDef(ncallee, targs), args map rec)
+
+      case CaseClass(cct, args) =>
+        val targs = cct.tps.map {
+          case tp: TypeParameter => tpMap.getOrElse(tp, tp)
+          case t                 => t
+        }.distinct
+        CaseClass(CaseClassType(cct.classDef, targs), args map rec)
+
       case Variable(id) if paramMap.contains(id) =>
         paramMap(id).toVariable
-      case TupleSelect(tup, index) => TupleSelect(rec(tup), index)
+      case TupleSelect(tup, index) =>
+        TupleSelect(rec(tup), index)
       case Ensuring(NoTree(_), post) =>
         Ensuring(nfd.fullBody, rec(post)) // the newfd body would already be type correct
       case Operator(args, op)      => op(args map rec)
       case t: Terminal             => t
     }
     val nbody = rec(ifd.fullBody)
-    //println(s"Inferring types for ${ifd.id} new fun: $nfd new body: $nbody")
     val initGamma = nfd.params.map(vd => vd.id -> vd.getType).toMap
-    TypeChecker.inferTypesOfLocals(nbody, initGamma)
+    /*if(ifd.id.name.contains("pushLeftWrapper")) {
+      println(s"Inferring types for ${ifd.id}")
+    }*/
+    val typedBody = TypeChecker.inferTypesOfLocals(nbody, initGamma)
+    /*if(ifd.id.name.contains("pushLeftWrapper")) {
+      //println(s"Inferring types for ${ifd.id} new fun: $nfd \n old body: ${ifd.fullBody} \n type correct body: $typedBody")
+      System.exit(0)
+    }*/
+    typedBody
   }
 
   def apply: Program = {
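
Note on typeClasses (TypeRectifier):
------------------------------------
The DisjointSets structure is used as a union-find over type parameters, and the guard
added above only unions pairs of type parameters, which keeps concrete types out of the
equivalence classes. A minimal stand-in for what the union-find does, over plain strings
rather than Leon's TypeTree (illustrative only):

  object UnionFindSketch extends App {
    val parent = scala.collection.mutable.Map[String, String]()
    def find(x: String): String = {
      val p = parent.getOrElseUpdate(x, x)
      if (p == x) x else { val r = find(p); parent(x) = r; r } // path compression
    }
    def union(x: String, y: String): Unit = { parent(find(x)) = find(y) }

    // e.g. a placeholder parameter "T@" ends up in the same class as the real parameter "T",
    // so it can later be substituted away
    union("T@", "T")
    println(find("T@") == find("T")) // true
  }
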
diff --git a/src/main/scala/leon/solvers/combinators/UnrollingSolver.scala b/src/main/scala/leon/solvers/combinators/UnrollingSolver.scala
index 0c7afeb65199042e34004199d17bba25dde9e1c5..4f749a96c71db74efc033ccebf52f575e40b1836 100644
--- a/src/main/scala/leon/solvers/combinators/UnrollingSolver.scala
+++ b/src/main/scala/leon/solvers/combinators/UnrollingSolver.scala
@@ -13,37 +13,37 @@ import purescala.ExprOps._
 import purescala.Types._
 import utils._
 
-import z3.FairZ3Component.{optFeelingLucky, optUseCodeGen, optAssumePre}
+import z3.FairZ3Component.{ optFeelingLucky, optUseCodeGen, optAssumePre }
 import templates._
 import evaluators._
 
 class UnrollingSolver(val context: LeonContext, val program: Program, underlying: Solver)
-  extends Solver
-     with NaiveAssumptionSolver
-     with EvaluatingSolver
-     with QuantificationSolver {
+    extends Solver
+    with NaiveAssumptionSolver
+    with EvaluatingSolver
+    with QuantificationSolver {
 
-  val feelingLucky   = context.findOptionOrDefault(optFeelingLucky)
-  val useCodeGen     = context.findOptionOrDefault(optUseCodeGen)
+  val feelingLucky = context.findOptionOrDefault(optFeelingLucky)
+  val useCodeGen = context.findOptionOrDefault(optUseCodeGen)
   val assumePreHolds = context.findOptionOrDefault(optAssumePre)
 
-  protected var lastCheckResult : (Boolean, Option[Boolean], Option[HenkinModel]) = (false, None, None)
+  protected var lastCheckResult: (Boolean, Option[Boolean], Option[HenkinModel]) = (false, None, None)
 
-  private val freeVars    = new IncrementalSet[Identifier]()
+  private val freeVars = new IncrementalSet[Identifier]()
   private val constraints = new IncrementalSeq[Expr]()
 
-  protected var interrupted : Boolean = false
+  protected var interrupted: Boolean = false
 
   val reporter = context.reporter
 
-  def name = "U:"+underlying.name
+  def name = "U:" + underlying.name
 
   def free() {
     underlying.free()
   }
 
   val templateGenerator = new TemplateGenerator(new TemplateEncoder[Expr] {
-    def encodeId(id: Identifier): Expr= {
+    def encodeId(id: Identifier): Expr = {
       Variable(id.freshen)
     }
 
@@ -138,7 +138,7 @@ class UnrollingSolver(val context: LeonContext, val program: Program, underlying
           val ev: Expr = p._2 match {
             case RawArrayValue(_, mapping, dflt) =>
               mapping.collectFirst {
-                case (k,v) if evaluator.eval(Equals(k, tupleWrap(es))).result == Some(BooleanLiteral(true)) => v
+                case (k, v) if evaluator.eval(Equals(k, tupleWrap(es))).result == Some(BooleanLiteral(true)) => v
               } getOrElse dflt
             case _ => scala.sys.error("Unexpected function encoding " + p._2)
           }
@@ -176,14 +176,14 @@ class UnrollingSolver(val context: LeonContext, val program: Program, underlying
         false
 
       case Successful(e) =>
-        reporter.warning("- Model leads unexpected result: "+e)
+        reporter.warning("- Model leads unexpected result: " + e)
         false
 
       case RuntimeError(msg) =>
         reporter.debug("- Model leads to runtime error.")
         false
 
-      case EvaluatorError(msg) => 
+      case EvaluatorError(msg) =>
         if (silenceErrors) {
           reporter.debug("- Model leads to evaluator error: " + msg)
         } else {
@@ -196,7 +196,7 @@ class UnrollingSolver(val context: LeonContext, val program: Program, underlying
   def genericCheck(assumptions: Set[Expr]): Option[Boolean] = {
     lastCheckResult = (false, None, None)
 
-    while(!hasFoundAnswer && !interrupted) {
+    while (!hasFoundAnswer && !interrupted) {
       reporter.debug(" - Running search...")
 
       solver.push()
@@ -260,29 +260,33 @@ class UnrollingSolver(val context: LeonContext, val program: Program, underlying
             solver.pop()
           }
 
-          if(interrupted) {
+          if (interrupted) {
             foundAnswer(None)
           }
 
-          if(!hasFoundAnswer) {
+          if (!hasFoundAnswer) {
             reporter.debug("- We need to keep going.")
 
-            val toRelease = unrollingBank.getBlockersToUnlock
+            //for (i <- 1 to 3) {
+              val toRelease = unrollingBank.getBlockersToUnlock
 
-            reporter.debug(" - more unrollings")
+              reporter.debug(" - more unrollings")
 
-            val newClauses = unrollingBank.unrollBehind(toRelease)
+              // enclose within for loop
+              val newClauses = unrollingBank.unrollBehind(toRelease)
 
-            for(ncl <- newClauses) {
-              solver.assertCnstr(ncl)
-            }
+              for (ncl <- newClauses) {
+                solver.assertCnstr(ncl)
+              }
+            //}
+            // finish here
 
             reporter.debug(" - finished unrolling")
           }
       }
     }
 
-    if(interrupted) {
+    if (interrupted) {
       None
     } else {
       lastCheckResult._2
@@ -300,10 +304,10 @@ class UnrollingSolver(val context: LeonContext, val program: Program, underlying
 
   override def reset() = {
     underlying.reset()
-    lastCheckResult  = (false, None, None)
+    lastCheckResult = (false, None, None)
     freeVars.reset()
     constraints.reset()
-    interrupted      = false
+    interrupted = false
   }
 
   override def interrupt(): Unit = {
diff --git a/src/main/scala/leon/transformations/SerialInstrumentationPhase.scala b/src/main/scala/leon/transformations/SerialInstrumentationPhase.scala
index 33f5cacc280d6bab36c11da176cbbdd01c28861e..a8d1d8306c7cded60c3d8d30443fb1b2776ca08d 100644
--- a/src/main/scala/leon/transformations/SerialInstrumentationPhase.scala
+++ b/src/main/scala/leon/transformations/SerialInstrumentationPhase.scala
@@ -233,36 +233,37 @@ class ExprInstrumenter(funMap: Map[FunDef, FunDef], serialInst: SerialInstrument
           val finalRes = Tuple(t +: instPart)
           finalRes
 
-        case f @ FunctionInvocation(tfd, args) if tfd.fd.isRealFunction =>
-          val newfd = TypedFunDef(funMap(tfd.fd), tfd.tps)
-          val newFunInv = FunctionInvocation(newfd, subeVals)
-          //create a variables to store the result of function invocation
-          if (serialInst.instFuncs(tfd.fd)) {
-            //this function is also instrumented
-            val resvar = Variable(FreshIdentifier("e", newfd.returnType, true))
-            val valexpr = TupleSelect(resvar, 1)
-            val instexprs = instrumenters.map { m =>
-              val calleeInst = if (serialInst.funcInsts(tfd.fd).contains(m.inst)) {
-                List(serialInst.selectInst(tfd.fd)(resvar, m.inst))
-              } else List()
-              //Note we need to ensure that the last element of list is the instval of the finv
-              m.instrument(e, subeInsts.getOrElse(m.inst, List()) ++ calleeInst, Some(resvar))
-            }
-            Let(resvar.id, newFunInv, Tuple(valexpr +: instexprs))
-          } else {
-            val resvar = Variable(FreshIdentifier("e", newFunInv.getType, true))
-            val instexprs = instrumenters.map { m =>
-              m.instrument(e, subeInsts.getOrElse(m.inst, List()))
+        // TODO: We are ignoring the construction cost of fields. Fix this.
+        case f @ FunctionInvocation(TypedFunDef(fd, tps), args) =>
+          if (!fd.hasLazyFieldFlag) {
+            val newfd = TypedFunDef(funMap(fd), tps)
+            val newFunInv = FunctionInvocation(newfd, subeVals)
+            //create a variables to store the result of function invocation
+            if (serialInst.instFuncs(fd)) {
+              //this function is also instrumented
+              val resvar = Variable(FreshIdentifier("e", newfd.returnType, true))
+              val valexpr = TupleSelect(resvar, 1)
+              val instexprs = instrumenters.map { m =>
+                val calleeInst =
+                  if (serialInst.funcInsts(fd).contains(m.inst) &&
+                      fd.isUserFunction) {
+                    // ignoring fields here
+                    List(serialInst.selectInst(fd)(resvar, m.inst))
+                  } else List()
+                //Note we need to ensure that the last element of list is the instval of the finv
+                m.instrument(e, subeInsts.getOrElse(m.inst, List()) ++ calleeInst, Some(resvar))
+              }
+              Let(resvar.id, newFunInv, Tuple(valexpr +: instexprs))
+            } else {
+              val resvar = Variable(FreshIdentifier("e", newFunInv.getType, true))
+              val instexprs = instrumenters.map { m =>
+                m.instrument(e, subeInsts.getOrElse(m.inst, List()))
+              }
+              Let(resvar.id, newFunInv, Tuple(resvar +: instexprs))
             }
-            Let(resvar.id, newFunInv, Tuple(resvar +: instexprs))
-          }
-
-        // This case will be taken if the function invocation is actually a val (lazy or otherwise) in the class
-        case f @ FunctionInvocation(tfd, args) =>
-          val resvar = Variable(FreshIdentifier("e", tfd.fd.returnType, true))
-          val instPart = instrumenters map (_.instrument(f, Seq()))
-          val finalRes = Tuple(f +: instPart)
-          finalRes
+          } else
+            throw new UnsupportedOperationException("Lazy fields are not handled in instrumentation." +
+              " Consider using the --lazy option and rewrite your program using lazy constructor `$`")
 
         case _ =>
           val exprPart = recons(subeVals)
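
Note on the instrumentation of call sites (SerialInstrumentationPhase):
-----------------------------------------------------------------------
An instrumented function returns a tuple of its value and its resource counts, and every
call to an instrumented callee is wrapped in a Let that extracts the callee's value and
folds the callee's count into the caller's. A source-level intuition for what the phase
builds on the AST (the functions and the cost constants below are made up; the real
counts come from the instrumenters):

  object InstSketch extends App {
    // instrumented form of `def incr(x: BigInt) = x + 1`: returns (value, steps)
    def incr(x: BigInt): (BigInt, BigInt) = (x + 1, BigInt(1))

    // instrumented form of `def twice(x: BigInt) = incr(incr(x))`
    def twice(x: BigInt): (BigInt, BigInt) = {
      val e  = incr(x)                   // corresponds to Let(resvar, newFunInv, ...)
      val e2 = incr(e._1)
      (e2._1, BigInt(2) + e._2 + e2._2)  // own steps plus the callees' steps
    }

    println(twice(10)) // prints (12,4)
  }
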
diff --git a/testcases/lazy-datastructures/BottomUpMegeSort.scala b/testcases/lazy-datastructures/BottomUpMegeSort.scala
index 23ae5e1e3c507fb22a0d908c1f8458f7638ad4ef..71c006399f467fe6e6c06cc8c6b7d05a621f4277 100644
--- a/testcases/lazy-datastructures/BottomUpMegeSort.scala
+++ b/testcases/lazy-datastructures/BottomUpMegeSort.scala
@@ -7,8 +7,15 @@ import leon.annotation._
 import leon.instrumentation._
 //import leon.invariant._
 
+/**
+ * TODO: Multiple instantiations of type parameters are not supported yet,
+ * since they require the creation of multiple states, one for each instantiation
+ */
 object BottomUpMergeSort {
 
+  /**
+   * A list of integers that have to be sorted
+   */
   sealed abstract class IList {
     def size: BigInt = {
       this match {
@@ -20,124 +27,149 @@ object BottomUpMergeSort {
   case class ICons(x: BigInt, tail: IList) extends IList
   case class INil() extends IList
 
-  sealed abstract class ILList {
+  /**
+   * A stream of integers (the tail is lazy)
+   */
+  sealed abstract class IStream {
     def size: BigInt = {
       this match {
-        case LCons(_, xs) => 1 + ssize(xs)
+        case SCons(_, xs) => 1 + ssize(xs)
         case _            => BigInt(0)
       }
     } ensuring (_ >= 0)
   }
-  case class LCons(x: BigInt, tail: $[ILList]) extends ILList
-  case class LNil() extends ILList
-  def ssize(l: $[ILList]): BigInt = (l*).size
+  case class SCons(x: BigInt, tail: $[IStream]) extends IStream
+  case class SNil() extends IStream
+  def ssize(l: $[IStream]): BigInt = (l*).size
 
-  // TODO: making this parametric will break many things. Fix them
+  /**
+   * A list of lazy closures
+   */
   sealed abstract class LList {
     def size: BigInt = {
       this match {
-        case SNil()      => BigInt(0)
-        case SCons(_, t) => 1 + t.size
+        case LNil()      => BigInt(0)
+        case LCons(_, t) => 1 + t.size
       }
     } ensuring (_ >= 0)
 
     def valid: Boolean = {
       this match {
-        case SNil()      => true
-        case SCons(l, t) => ssize(l) > 0 && t.valid
+        case LNil()      => true
+        case LCons(l, t) => ssize(l) > 0 && t.valid
       }
     }
 
     def fullSize: BigInt = {
       this match {
-        case SNil()      => BigInt(0)
-        case SCons(l, t) => ssize(l) + t.fullSize
+        case LNil()      => BigInt(0)
+        case LCons(l, t) => ssize(l) + t.fullSize
       }
     } ensuring (_ >= 0)
   }
-  case class SCons(x: $[ILList], tail: LList) extends LList
-  case class SNil() extends LList
+  case class LCons(x: $[IStream], tail: LList) extends LList
+  case class LNil() extends LList
 
-  // takes O(n) time
+  /**
+   * A function that, given a list of (lazy) sorted lists,
+   * groups them into pairs and lazily invokes the 'merge' function
+   * on each pair.
+   * Takes time linear in the size of the input list
+   */
   def pairs(l: LList): LList = {
     require(l.valid)
     l match {
-      case SNil()           => SNil()
-      case SCons(_, SNil()) => l
-      case SCons(l1, SCons(l2, rest)) =>
-        SCons($(merge(l1, l2)), pairs(rest))
+      case LNil()           => LNil()
+      case LCons(_, LNil()) => l
+      case LCons(l1, LCons(l2, rest)) =>
+        LCons($(merge(l1, l2)), pairs(rest))
     }
   } ensuring (res => res.size <= (l.size + 1) / 2 &&
     l.fullSize == res.fullSize &&
     res.valid &&
     time <= 10 * l.size + 4)
 
+  /**
+   * Creates a binary tree of merges by pairing adjacent lists, e.g. merge(merge(2, 1), merge(17, 19)).
+   * Takes time linear in the size of the input list.
+   */
   def constructMergeTree(l: LList): LList = {
     require(l.valid)
     l match {
-      case SNil()           => SNil()
-      case SCons(_, SNil()) => l
+      case LNil()           => LNil()
+      case LCons(_, LNil()) => l
       case _ =>
         constructMergeTree(pairs(l))
     }
   } ensuring (res => res.size <= 1 && res.fullSize == l.fullSize &&
     (res match {
-      case SCons(il, SNil()) =>
+      case LCons(il, LNil()) =>
         res.fullSize == ssize(il) // this is implied by the previous conditions
       case _ => true
     }) &&
     res.valid &&
     time <= 42 * l.size + 4)
 
-  // here size of res is required to be >= 1
-  def merge(a: $[ILList], b: $[ILList]): ILList = {
-    require(((a*) != LNil() || b.isEvaluated) && // if one of them is Nil then the other is evaluated
-        ((b*) != LNil() || a.isEvaluated) &&
-        ((a*) != LNil() || (b*) != LNil())) // one of them is not Nil
+  /**
+   *  A function that merges two sorted streams of integers.
+   *  Note: the sorted stream of integers may be recursively constructed using merge.
+   *  Takes time linear in the size of the streams (non-trivial to prove due to cascading of lazy calls)
+   */
+  def merge(a: $[IStream], b: $[IStream]): IStream = {
+    require(((a*) != SNil() || b.isEvaluated) && // if one of the arguments is Nil then the other is evaluated
+        ((b*) != SNil() || a.isEvaluated) &&
+        ((a*) != SNil() || (b*) != SNil())) // at least one of the arguments is not Nil
     b.value match {
-      case LNil() => a.value
-      case bl @ LCons(x, xs) =>
+      case SNil() => a.value
+      case bl @ SCons(x, xs) =>
         a.value match {
-          case LNil() => bl
-          case LCons(y, ys) =>
+          case SNil() => bl
+          case SCons(y, ys) =>
             if (y < x)
-              LCons(y, $(merge(ys, b)))
+              SCons(y, $(merge(ys, b)))
             else
-              LCons(x, $(merge(a, xs)))
+              SCons(x, $(merge(a, xs)))
         }
     }
   } ensuring (res => ssize(a) + ssize(b) == res.size &&
-       time <= 300 * res.size - 100)
+       time <= 300 * res.size - 100) // note: res.size >= 1
 
-  // this will take O(n) time
+  /**
+   * Converts a list of integers to a list of streams of integers
+   */
   def IListToLList(l: IList): LList = {
     l match {
-      case INil() => SNil()
+      case INil() => LNil()
       case ICons(x, xs) =>
-        SCons(LCons(x, LNil()), IListToLList(xs))
+        LCons(SCons(x, SNil()), IListToLList(xs))
     }
   } ensuring (res => res.fullSize == l.size &&
     res.size == l.size &&
     res.valid &&
     time <= 11 * l.size + 3)
 
-  def mergeSort(l: IList): ILList = {
+  /**
+   * Takes list of integers and returns a sorted stream of integers.
+   * Takes time linear in the size of the input since it sorts lazily.
+   */
+  def mergeSort(l: IList): IStream = {
     l match {
-      case INil() => LNil()
+      case INil() => SNil()
       case _ =>
         constructMergeTree(IListToLList(l)) match {
-          case SCons(r, SNil()) => r.value
+          case LCons(r, LNil()) => r.value
         }
     }
   } ensuring (res => time <= 400 * l.size + 10)
 
   /**
-   * Order statistics
+   * Order statistics.
+   * A function that returns the minimum element of a list by forcing only the first element of the lazily sorted stream.
    */
   def firstMin(l: IList) : BigInt ={
     require(l != INil())
     mergeSort(l) match {
-      case LCons(x, rest) => x
+      case SCons(x, rest) => x
     }
   } ensuring (res => time <= 400 * l.size + 20)
 }
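
Note on why lazy sorting makes firstMin cheap:
----------------------------------------------
Each merge produces its head after a single comparison and suspends the recursive merge
in its tail, so computing firstMin only does work proportional to the number of merge
nodes, i.e. linear in the list size, instead of performing the full n log n sort. A
plain-Scala analogue using Scala 2.13's LazyList instead of Leon's $ closures
(illustrative only; no resource bounds are proved here):

  object LazyMergeSketch extends App {
    def merge(a: LazyList[BigInt], b: LazyList[BigInt]): LazyList[BigInt] =
      if (a.isEmpty) b
      else if (b.isEmpty) a
      else if (a.head <= b.head) a.head #:: merge(a.tail, b) // the tail is suspended
      else b.head #:: merge(a, b.tail)

    def pairUp(ls: List[LazyList[BigInt]]): List[LazyList[BigInt]] = ls match {
      case a :: b :: rest => merge(a, b) :: pairUp(rest)     // analogue of `pairs`
      case _              => ls
    }

    def mergeTree(ls: List[LazyList[BigInt]]): LazyList[BigInt] = ls match {
      case Nil     => LazyList.empty
      case List(s) => s
      case _       => mergeTree(pairUp(ls))                  // analogue of constructMergeTree
    }

    def mergeSort(l: List[BigInt]): LazyList[BigInt] = mergeTree(l.map(x => LazyList(x)))

    def firstMin(l: List[BigInt]): BigInt = mergeSort(l).head // forces O(l.size) comparisons

    println(firstMin(List[BigInt](17, 2, 19, 1))) // prints 1
  }
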
diff --git a/testcases/lazy-datastructures/Conqueue-strategy1-with-uniqueness.scala b/testcases/lazy-datastructures/Conqueue-strategy1-with-uniqueness.scala
new file mode 100644
index 0000000000000000000000000000000000000000..d2f6521265f119a958883f5b57aa57975dfd222d
--- /dev/null
+++ b/testcases/lazy-datastructures/Conqueue-strategy1-with-uniqueness.scala
@@ -0,0 +1,493 @@
+import leon.lazyeval._
+import leon.lang._
+import leon.annotation._
+import leon.instrumentation._
+import leon.lang.synthesis._
+import ConcTrees._
+import ConQ._
+import Conqueue._
+
+object ConcTrees {
+  abstract class Conc[T] {
+    def isEmpty: Boolean = this == Empty[T]()
+
+    def level: BigInt = {
+      this match {
+        case Empty() =>
+          BigInt(0)
+        case Single(x) =>
+          BigInt(0)
+        case CC(l, r) =>
+          BigInt(1) + max(l.level, r.level)
+      }
+    } ensuring {
+      (x$1: BigInt) => x$1 >= BigInt(0)
+    }
+  }
+
+  case class Empty[T]() extends Conc[T]
+
+  case class Single[T](x: T) extends Conc[T]
+
+  case class CC[T](left: Conc[T], right: Conc[T]) extends Conc[T]
+
+  def max(x: BigInt, y: BigInt): BigInt = if (x >= y) {
+    x
+  } else {
+    y
+  }
+
+  def abs(x: BigInt): BigInt = if (x < BigInt(0)) {
+    -x
+  } else {
+    x
+  }
+}
+
+object Conqueue {
+  abstract class ConQ[T] {
+    def isSpine: Boolean = this match {
+      case Spine(_, _, _) =>
+        true
+      case _ =>
+        false
+    }
+    def isTip: Boolean = !this.isSpine
+  }
+
+  case class Tip[T](t: Conc[T]) extends ConQ[T]
+
+  case class Spine[T](head: Conc[T], createdWithSusp: Bool, rear: LazyConQ[T]) extends ConQ[T]
+
+  sealed abstract class Bool
+  case class True() extends Bool
+  case class False() extends Bool
+
+  abstract class Scheds[T]
+
+  case class Cons[T](h: LazyConQ[T], tail: Scheds[T]) extends Scheds[T]
+
+  case class Nil[T]() extends Scheds[T]
+
+  case class Wrapper[T](queue: LazyConQ[T], schedule: Scheds[T]) {
+    def valid(st: Set[LazyConQ[T]]): Boolean = zeroPreceedsLazy[T](this.queue, st) && schedulesProperty(this.queue, this.schedule, st)
+  }
+
+  def zeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (isEvaluated(q, st)) {
+      evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, rear) => true
+        case Spine(_, _, rear) =>
+          zeroPreceedsLazy[T](rear, st)
+        case Tip(_) => true
+      }
+    } else false
+  }
+
+  def zeroPredLazyMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && zeroPreceedsLazy(q, st1))
+    zeroPreceedsLazy(q, st2) &&
+      //induction scheme
+      (evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, _) =>
+          true
+        case Spine(h, _, rear) =>
+          zeroPredLazyMonotone(st1, st2, rear)
+        case Tip(_) =>
+          true
+      })
+  } holds
+
+  def weakZeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    evalLazyConQS[T](q) match {
+      case Spine(Empty(), _, rear10) =>
+        true
+      case Spine(h, _, rear11) =>
+        zeroPreceedsLazy[T](rear11, st)
+      case Tip(_) =>
+        true
+    }
+  }
+
+  def firstUnevaluated[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): LazyConQ[T] = {
+    if (isEvaluated(l, st)) {
+      evalLazyConQS[T](l) match {
+        case Spine(_, _, tail15) =>
+          firstUnevaluated[T](tail15, st)
+        case _ =>
+          l
+      }
+    } else {
+      l
+    }
+  } ensuring {
+    (res65: LazyConQ[T]) =>
+      (evalLazyConQS[T](res65).isTip || !st.contains(res65)) /*&&
+        {
+          val dres4 = evalLazyConQ[T](res65, st)
+          dres4._1 match {
+            case Spine(_, _, tail16) =>
+              firstUnevaluated[T](l, dres4._2) == firstUnevaluated[T](tail16, dres4._2)
+            case _ =>
+              true
+          }
+        }*/
+  }
+
+  def nextUnevaluatedLemma[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    val (res, nst) = evalLazyConQ[T](firstUnevaluated(l, st), st)
+    (res match {
+      case Spine(_, _, tail) =>
+        firstUnevaluated[T](l, nst) == firstUnevaluated[T](tail, nst)
+      case _ =>
+        true
+    }) &&
+      // induction scheme
+      (evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          nextUnevaluatedLemma(tail, st)
+        case _ => true
+      })
+  } holds
+
+  def schedulesProperty[T](q: LazyConQ[T], schs: Scheds[T], st: Set[LazyConQ[T]]): Boolean = {
+    val x = firstUnevaluated(q, st)
+    val y = evalLazyConQS(x)
+    schs match {
+      case Cons(head, tail) =>
+        y.isSpine && x == head && schedulesProperty[T](pushUntilZero[T](head), tail, st) &&
+          weakZeroPreceedsLazy(head, st)
+      case Nil() =>
+        y.isTip
+    }
+  }
+
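+  // Informal reading (added note): pushUntilZero skips over Empty()-headed spine cells and
+  // returns the rear of the first spine cell with a non-empty head (or q itself for a Tip).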
+  def pushUntilZero[T](q: LazyConQ[T]): LazyConQ[T] = evalLazyConQS[T](q) match {
+    case Spine(Empty(), _, rear12) =>
+      pushUntilZero[T](rear12)
+    case Spine(h, _, rear13) =>
+      rear13
+    case Tip(_) =>
+      q
+  }
+
+  def pushLeft[T](ys: Single[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(zeroPreceedsLazy[T](xs, st) && ys.isInstanceOf[Single[T]])
+    val dres5 = evalLazyConQ[T](xs, st)
+    dres5._1 match {
+      case Tip(CC(_, _)) =>
+        (Spine[T](ys, False(), xs), dres5._2)
+      case Tip(Empty()) =>
+        (Tip[T](ys), dres5._2)
+      case Tip(t @ Single(_)) =>
+        (Tip[T](CC[T](ys, t)), dres5._2)
+      case s @ Spine(Empty(), _, rear) =>
+        (Spine[T](ys, False(), rear), dres5._2) // in this case firstUnevaluated[T](rear, st) == firstUnevaluated[T](xs, st)
+      case s @ Spine(_, _, _) =>
+        pushLeftLazy[T](ys, xs, dres5._2)
+    }
+  } ensuring { res =>
+    true
+  }
+
+  def dummyFun[T]() = ???[(ConQ[T], Set[LazyConQ[T]])]
+
+  @library
+  def pushLeftLazyUI[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    dummyFun()
+  } ensuring (res => res._2 == st && (res._1 match {
+    case Spine(Empty(), createdWithSusp, rear) =>
+      val (rearval, _) = evalLazyConQ[T](rear, st) // this is necessary to assert properties on the state in the recursive invocation
+      val p = pushUntilZero[T](rear)
+      val fr = firstUnevaluated[T](p, st)
+      rearval.isSpine && {
+        if (createdWithSusp == False()) {
+          fr == firstUnevaluated[T](rear, st)
+        } else
+          !isEvaluated(rear, st)
+      } &&
+        {
+          val f = firstUnevaluated[T](xs, st)
+          ((evalLazyConQS[T](f).isTip &&
+            evalLazyConQS[T](fr).isTip) || fr == f)
+        } &&
+        weakZeroPreceedsLazy(rear, st)
+    case _ => false
+  }))
+
+  def pushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(!ys.isEmpty && zeroPreceedsLazy[T](xs, st) &&
+      (evalLazyConQS[T](xs) match {
+        case Spine(h, _, _) => h != Empty[T]()
+        case _ => false
+      }))
+    val dres = evalLazyConQ[T](xs, st)
+    dres._1 match {
+      case Spine(head, _, rear15) =>
+        val carry = CC[T](head, ys)
+        val dres2 = evalLazyConQ[T](rear15, dres._2)
+        dres2._1 match {
+          case s @ Spine(Empty(), _, _) =>
+            (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), rear15))), dres2._2)
+          case s @ Spine(_, _, _) =>
+            (Spine[T](Empty[T](), True(), newConQ[T](PushLeftLazy[T](carry, rear15), dres2._2)), dres2._2)
+          case t @ Tip(tree) =>
+            if (tree.level > carry.level) { // can this happen? it would mean the tree's level is at least two greater than that of the rear
+              val x: ConQ[T] = t
+              (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), rear15))), dres2._2)
+            } else { // here tree level and carry level are equal
+              val x: ConQ[T] = Tip[T](CC[T](tree, carry))
+              (Spine[T](Empty[T](), False(), Eager(Spine(Empty[T](), False(), Eager[T](x)))), dres2._2)
+            }
+        }
+    }
+  } ensuring {
+    (res66: (Spine[T], Set[LazyConQ[T]])) =>
+      (res66._2 == st) && (res66._1 match {
+        case Spine(Empty(), createdWithSusp, rear) =>
+          val (rearval, _) = evalLazyConQ[T](rear, st) // this is necessary to assert properties on the state in the recursive invocation
+          val p = pushUntilZero[T](rear)
+          val fr = firstUnevaluated[T](p, st)
+          rearval.isSpine && {
+            if (createdWithSusp == False()) {
+              fr == firstUnevaluated[T](rear, st)
+            } else
+              !isEvaluated(rear, st)
+          } &&
+            {
+              val f = firstUnevaluated[T](xs, st)
+              ((evalLazyConQS[T](f).isTip &&
+                evalLazyConQS[T](fr).isTip) || fr == f)
+            } &&
+            weakZeroPreceedsLazy(rear, st)
+        case _ =>
+          false
+      })
+  }
+
+  def pushLeftWrapper[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]])
+    val (nq, nst) = pushLeft[T](ys, w.queue, st)
+    val nsched = nq match {
+      case Spine(Empty(), createdWithSusp, rear18) if createdWithSusp == True() =>
+        Cons[T](rear18, w.schedule) // this is the only case where we create a new lazy closure
+      case _ =>
+        w.schedule
+    }
+    (Eager(nq), nsched, nst)
+  } ensuring { res => schedulesProperty(res._1, res._2, res._3) }
+
+  /*def PushLeftLazypushLeftLazyLem[T](rear15 : LazyConQ[T], head : Conc[T], dres : (ConQ[T], Set[LazyConQ[T]]), st : Set[LazyConQ[T]], xs : LazyConQ[T], s : Spine[T], dres : (ConQ[T], Set[LazyConQ[T]]), carry : CC[T], ys : Conc[T]): Boolean =  {
+    (!ys.isEmpty && zeroPreceedsLazy[T](xs, st) && evalLazyConQS[T](xs).isSpine && dres == evalLazyConQ[T](xs, st) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && head == dres._1.head && rear15 == dres._1.rear && carry == CC[T](head, ys) && dres == evalLazyConQ[T](rear15, dres._2) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && s == dres._1) ==> (!carry.isEmpty && zeroPreceedsLazy[T](rear15, dres._2) && evalLazyConQS[T](rear15).isSpine)
+  } ensuring {
+    (holds : Boolean) => holds
+  }*/
+
+  def streamContains[T](l: LazyConQ[T], newl: LazyConQ[T]): Boolean = {
+    (l == newl) || (evalLazyConQS[T](l) match {
+      case Spine(_, _, tail) =>
+        streamContains(tail, newl)
+      case _ => false
+    })
+  }
+
+  // monotonicity of firstUnevaluated (abbreviated "fune" in the comments below)
+  def funeMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], l: LazyConQ[T], newl: LazyConQ[T]): Boolean = {
+    require(st2 == st1 ++ Set(newl) &&
+      !streamContains(l, newl))
+    (firstUnevaluated(l, st1) == firstUnevaluated(l, st2)) && //property
+      //induction scheme
+      (evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          funeMonotone(st1, st2, tail, newl)
+        case _ =>
+          true
+      })
+  } holds
+
+  @library // To be proven
+  def schedMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], scheds: Scheds[T], l: LazyConQ[T], newl: LazyConQ[T]): Boolean = {
+    require((st2 == st1 ++ Set(newl)) &&
+      !streamContains(l, newl) && // newl is not contained in 'l'
+      schedulesProperty(l, scheds, st1))
+
+      funeMonotone(st1, st2, l, newl) && //instantiations
+      schedulesProperty(l, scheds, st2) && //property
+      //induction scheme
+      (scheds match {
+        case Cons(head, tail) =>
+          (evalLazyConQS[T](head) match {
+            case Spine(h, _, rear11) if h != Empty[T]()=>
+              zeroPredLazyMonotone(st1, st2, rear11)
+            case _ => true
+          }) &&
+          schedMonotone(st1, st2, tail, pushUntilZero(head), newl)
+        case Nil() => true
+      })
+  } holds
+
+  @library
+  def dummyAxiom[T](l: LazyConQ[T], nl: LazyConQ[T]): Boolean = {
+    !streamContains(l, nl)
+  } holds
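+  // Informal note (added): assumed without proof, this states that nl does not occur in the
+  // stream rooted at l; it discharges the !streamContains preconditions of the monotonicity lemmas.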
+
+  def funeCompose[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2))
+    (firstUnevaluated(firstUnevaluated(q, st1), st2) == firstUnevaluated(q, st2)) && //property
+      //induction scheme
+      (evalLazyConQS[T](q) match {
+        case Spine(_, _, tail) =>
+          funeCompose(st1, st2, tail)
+        case _ =>
+          true
+      })
+  } holds
+
+  // induction following the structure of zeroPredLazy
+  def funeZeroProp[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && {
+      val x = firstUnevaluated(q, st1)
+      st2.contains(x) && weakZeroPreceedsLazy(x, st1)
+    })
+    zeroPreceedsLazy(q, st2) && //property
+      //induction scheme
+      (evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, tail) =>
+          true
+        case Spine(_, _, tail) =>
+          (if (isEvaluated(q, st1))
+            funeZeroProp(st1, st2, tail)
+          else true)
+        case _ =>
+          true
+      })
+  } holds
+
+  // induction following the structure of zeroPredLazy
+  def funeZeroProp2[T](st: Set[LazyConQ[T]], q: LazyConQ[T]): Boolean = {
+    require(evalLazyConQS(firstUnevaluated(q, st)).isTip)
+    zeroPreceedsLazy(q, st) && //property
+      //induction scheme
+      (evalLazyConQS[T](q) match {
+        case Spine(_, _, tail) =>
+          funeZeroProp2(st, tail)
+        case _ =>
+          true
+      })
+  } holds
+
+  def Pay[T](q: LazyConQ[T], scheds: Scheds[T], st: Set[LazyConQ[T]]): (Scheds[T], Set[LazyConQ[T]]) = {
+    require(schedulesProperty(q, scheds, st) && isEvaluated(q, st))
+    val (nschs, rst) = scheds match {
+      case c @ Cons(head, rest) =>
+        val (headval, st2) = evalLazyConQ(head, st)
+        (headval match {
+          case Spine(Empty(), createdWithSusp, rear) =>
+            if (createdWithSusp == True()) {
+              Cons(rear, rest)
+            } else
+              rest
+//            In this case,
+//              val p = pushUntilZero(rear)
+//            	firstUnevaluated(q, res._2) == firstUnevaluated(rear, res._2) &&
+//            	firstUnevaluated(p, st) == rhead &&
+//            	rhead == firstUnevaluated(rear, st) &&
+          case _ =>
+            rest
+          // Note: head has to be spine since scheds is not empty
+          // in this case: firstUnevaluated[T](headval.rear, st) == rhead  && firstUnevaluated[T](q, res._2) == rhead by funeCompose
+            //firstUnevaluated(rear, st) == rhead &&
+          //schedulesProperty(pushUntilZero(head), res._1, st) &&
+          //schedulesProperty(pushUntilZero(rhead), rtail, st)
+          // schedMonotone(st, res._2, rtail, pushUntilZero(rhead), head)
+        }, st2)
+      case Nil() =>
+        (scheds, st)
+    }
+    (nschs, rst)
+  } ensuring { res =>
+      zeroPreceedsLazy(q, res._2) &&
+      schedulesProperty(q, res._1, res._2) &&
+      // instantiations (relating rhead and head)
+      funeCompose(st, res._2, q) &&
+      (scheds match {
+        case Cons(head, rest) =>
+          (res._1 match {
+            case Cons(rhead, rtail) =>
+              val p = pushUntilZero(rhead)
+              dummyAxiom(p, head) &&
+                schedMonotone(st, res._2, rtail, p, head) &&
+              {
+                // an instantiation that could be packed into schedMonotone
+                evalLazyConQS(rhead) match {
+                  case Spine(h, _, rear11) if h != Empty[T]()=>
+                  	zeroPredLazyMonotone(st, res._2, rear11)
+                  case _ => true
+                }
+              }
+            case _ => true
+          }) &&
+            (evalLazyConQS(head) match {
+              case Spine(Empty(), cws, rear) =>
+                dummyAxiom(rear, head) &&
+                funeMonotone(st, res._2, rear, head) &&
+                nextUnevaluatedLemma(q, st)
+              case _ => true
+            })
+        case _ => true
+      }) &&
+      // instantiations for proving zeroPreceedsLazy property
+      (scheds match {
+        case Cons(head, rest) =>
+          // establishing the zeroPreceedsLazy Property (on this case)
+          //fune(q, st) == head && weakZero(head, st) && res._2 == st ++ { head }
+          funeZeroProp(st, res._2, q) //instantiation
+        case _ =>
+          funeZeroProp2(st, q)
+      })
+  }
+
+  def pushLeftAndPay[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]])
+    val (q, scheds, nst) = pushLeftWrapper(ys, w, st)
+    val (nscheds, fst) = Pay(q, scheds, nst)
+    (Wrapper(q, nscheds), fst)
+  } ensuring { res => res._1.valid(res._2) }
+
+  def lazyarg1[T](x: ConQ[T]): ConQ[T] = x
+}
+
+object ConQ {
+
+  abstract class LazyConQ[T1]
+
+  case class Eager[T](x: ConQ[T]) extends LazyConQ[T]
+
+  case class PushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T]) extends LazyConQ[T]
+
+  @library
+  def newConQ[T1](cc: LazyConQ[T1], st: Set[LazyConQ[T1]]): LazyConQ[T1] = {
+    cc
+  } ensuring {
+    (res: LazyConQ[T1]) => !st.contains(res)
+  }
+
+  @library
+  def evalLazyConQ[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    cl match {
+      case t: Eager[T] =>
+        (t.x, st)
+      case t: PushLeftLazy[T] =>
+        val plres = pushLeftLazyUI[T](t.ys, t.xs, uiState())._1
+        val plst = pushLeftLazyUI[T](t.ys, t.xs, st)._2
+        (plres, (plst ++ Set[LazyConQ[T]](t)))
+    }
+  }
+
+  def isEvaluated[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]) = st.contains(cl) || cl.isInstanceOf[Eager[T]]
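+  // Informal reading (added note): st is the set of suspensions forced so far; Eager closures
+  // count as evaluated by construction, so only suspended closures ever need to be added to st.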
+
+  def uiState[T](): Set[LazyConQ[T]] = ???[Set[LazyConQ[T]]]
+
+  def evalLazyConQS[T](cl: LazyConQ[T]): ConQ[T] = evalLazyConQ[T](cl, uiState())._1
+
+}
diff --git a/testcases/lazy-datastructures/Conqueue-strategy2.scala b/testcases/lazy-datastructures/Conqueue-strategy2.scala
new file mode 100644
index 0000000000000000000000000000000000000000..ab1e7d0cba96fc7408432fdc580ce65e66dc22dd
--- /dev/null
+++ b/testcases/lazy-datastructures/Conqueue-strategy2.scala
@@ -0,0 +1,701 @@
+import leon.lazyeval._
+import leon.lang._
+import leon.annotation._
+import leon.instrumentation._
+import leon.lang.synthesis._
+import ConcTrees._
+import ConQ._
+import Conqueue._
+
+object ConcTrees {
+  abstract class Conc[T] {
+    def isEmpty: Boolean = this == Empty[T]()
+
+    def level: BigInt = {
+      this match {
+        case Empty() =>
+          BigInt(0)
+        case Single(x) =>
+          BigInt(0)
+        case CC(l, r) =>
+          BigInt(1) + max(l.level, r.level)
+      }
+    } ensuring {
+      (x$1: BigInt) => x$1 >= BigInt(0)
+    }
+  }
+
+  case class Empty[T]() extends Conc[T]
+
+  case class Single[T](x: T) extends Conc[T]
+
+  case class CC[T](left: Conc[T], right: Conc[T]) extends Conc[T]
+
+  def max(x: BigInt, y: BigInt): BigInt = if (x >= y) {
+    x
+  } else {
+    y
+  }
+
+  def abs(x: BigInt): BigInt = if (x < BigInt(0)) {
+    -x
+  } else {
+    x
+  }
+}
+
+object Conqueue {
+  abstract class ConQ[T] {
+    def isSpine: Boolean = this match {
+      case Spine(_, _, _) =>
+        true
+      case _ =>
+        false
+    }
+    def isTip: Boolean = !this.isSpine
+  }
+
+  case class Tip[T](t: Conc[T]) extends ConQ[T]
+
+  case class Spine[T](head: Conc[T], createdWithSusp: Bool, rear: LazyConQ[T]) extends ConQ[T]
+
+  sealed abstract class Bool
+  case class True() extends Bool
+  case class False() extends Bool
+
+  abstract class Scheds[T]
+
+  case class Cons[T](h: LazyConQ[T], tail: Scheds[T]) extends Scheds[T]
+
+  case class Nil[T]() extends Scheds[T]
+
+  case class Wrapper[T](queue: LazyConQ[T], schedule: Scheds[T]) {
+    def valid(st: Set[LazyConQ[T]]): Boolean = zeroPreceedsLazy[T](this.queue, st) &&
+      // schedulesProperty(this.queue, this.schedule, st)
+      strongSchedsProp(this.queue, this.schedule, 2, st) // head of the schedule should start after the first two elements
+  }
+
+  def zeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (isEvaluated(q, st)) {
+      evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, rear) => true
+        case Spine(_, _, rear) =>
+          zeroPreceedsLazy[T](rear, st)
+        case Tip(_) => true
+      }
+    } else false
+  }
+
+  // not used, but still interesting
+  def zeroPredLazyMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && zeroPreceedsLazy(q, st1))
+    zeroPreceedsLazy(q, st2) &&
+      //induction scheme
+      (evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, _) =>
+          true
+        case Spine(h, _, rear) =>
+          zeroPredLazyMonotone(st1, st2, rear)
+        case Tip(_) =>
+          true
+      })
+  } holds
+
+  def weakZeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    evalLazyConQS[T](q) match {
+      case Spine(Empty(), _, rear10) =>
+        true
+      case Spine(h, _, rear11) =>
+        zeroPreceedsLazy[T](rear11, st)
+      case Tip(_) =>
+        true
+    }
+  }
+
+  def firstUnevaluated[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): LazyConQ[T] = {
+    if (isEvaluated(l, st)) {
+      evalLazyConQS[T](l) match {
+        case Spine(_, _, tail15) =>
+          firstUnevaluated[T](tail15, st)
+        case _ =>
+          l
+      }
+    } else {
+      l
+    }
+  } ensuring {
+    (res65: LazyConQ[T]) =>
+      (evalLazyConQS[T](res65).isTip || !st.contains(res65)) &&
+        concreteUntil(l, res65, st)
+  }
+
+  def nextUnevaluatedLemma[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    val (res, nst) = evalLazyConQ[T](firstUnevaluated(l, st), st)
+    (res match {
+      case Spine(_, _, tail) =>
+        firstUnevaluated[T](l, nst) == firstUnevaluated[T](tail, nst)
+      case _ =>
+        true
+    }) &&
+      // induction scheme
+      (evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          nextUnevaluatedLemma(tail, st)
+        case _ => true
+      })
+  } holds
+
+  /**
+   * Everything until suf is evaluated
+   */
+  def concreteUntil[T](l: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (l != suf) {
+      isEvaluated(l, st) && (evalLazyConQS[T](l) match {
+        case Spine(_, cws, tail15) =>
+          concreteUntil(tail15, suf, st)
+        case _ =>
+          true
+      })
+    } else true
+  }
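+  // Informal reading (added note): every closure on the spine of l before suf has been forced in
+  // state st, while suf itself may still be an unevaluated suspension.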
+
+  def isConcrete[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = isEvaluated(l, st) && (evalLazyConQS[T](l) match {
+    case Spine(_, _, tail13) =>
+      isConcrete[T](tail13, st)
+    case _ =>
+      true
+  })
+
+  def schedulesProperty[T](q: LazyConQ[T], schs: Scheds[T], st: Set[LazyConQ[T]]): Boolean = {
+    schs match {
+      case Cons(head, tail) =>
+        evalLazyConQS[T](head) match {
+          case Spine(Empty(), _, _) =>
+            !head.isInstanceOf[Eager[T]] &&
+              concreteUntil(q, head, st) && schedulesProperty[T](pushUntilZero[T](head), tail, st)
+          case _ =>
+            false
+        }
+      case Nil() =>
+        isConcrete(q, st)
+    }
+  }
+
+  /*def strongSchedsProp[T](q: LazyConQ[T], schs: Scheds[T], st: Set[LazyConQ[T]]) = {
+    isEvaluated(q, st) && {
+      val l = evalLazyConQS[T](q) match {
+        case Spine(_, _, rear) =>
+          rear
+        case _ => q
+      }
+      schedulesProperty(l, schs, st)
+    }
+  }*/
+
+  /**
+   * Goes at most two steps
+   */
+  def nthTail[T](q: LazyConQ[T], n: BigInt): LazyConQ[T] = {
+    evalLazyConQS[T](q) match {
+      case Spine(_, _, rear) if n > 1 =>
+        evalLazyConQS[T](rear) match {
+          case Spine(_, _, srear) => srear
+          case _                  => rear
+        }
+      case Spine(_, _, rear) if n == 1 =>
+        rear
+      case _ =>
+        q
+    }
+  }
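+  // Informal reading (added note): nthTail skips up to two spine cells when n > 1,
+  // exactly one when n == 1, and none otherwise; it never goes more than two steps in.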
+
+  def strongSchedsProp[T](q: LazyConQ[T], schs: Scheds[T], sufOffset : BigInt, st: Set[LazyConQ[T]]) : Boolean = {
+    isEvaluated(q, st) && {
+      evalLazyConQS[T](q) match {
+        case Spine(_, _, rear) if sufOffset > 1 =>
+          isEvaluated(rear, st)
+        case _ => true
+      }
+    } &&
+    schedulesProperty(nthTail(q, sufOffset), schs, st)
+  }
+
+  // pushes a carry until there is a one
+  // TODO: rename this to pushUntilCarry
+  def pushUntilZero[T](q: LazyConQ[T]): LazyConQ[T] = evalLazyConQS[T](q) match {
+    case Spine(Empty(), _, rear12) =>
+      pushUntilZero[T](rear12)
+    case Spine(h, _, rear13) =>
+      rear13
+    case Tip(_) =>
+      q
+  }
+
+  def pushLeft[T](ys: Single[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(zeroPreceedsLazy[T](xs, st) && ys.isInstanceOf[Single[T]])
+
+    val dres5 = evalLazyConQ[T](xs, st)
+    dres5._1 match {
+      case Tip(CC(_, _)) =>
+        (Spine[T](ys, False(), xs), dres5._2)
+      case Tip(Empty()) =>
+        (Tip[T](ys), dres5._2)
+      case Tip(t @ Single(_)) =>
+        (Tip[T](CC[T](ys, t)), dres5._2)
+      case s @ Spine(Empty(), _, rear) =>
+        (Spine[T](ys, False(), rear), dres5._2) // in this case firstUnevaluated[T](rear, st) == firstUnevaluated[T](xs, st)
+      case s @ Spine(_, _, _) =>
+        pushLeftLazy[T](ys, xs, dres5._2)
+    }
+  }
+
+/*  def dummyFun[T]() = ???[(ConQ[T], Set[LazyConQ[T]])]
+
+  @library
+  def pushLeftLazyUI[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    dummyFun()
+  } ensuring (res => res._2 == st && (res._1 match {
+    case Spine(Empty(), createdWithSusp, rear) =>
+      true
+    case _ => false
+  }))*/
+
+  def pushLeftLazyVal[T](ys: Conc[T], xs: LazyConQ[T]) : ConQ[T] = ???[ConQ[T]]
+
+  @library
+  def resSpec[T](ys: Conc[T], xs: LazyConQ[T], res : ConQ[T]) = {
+    res == pushLeftLazyVal(ys, xs)
+  } holds
+
+  def pushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(!ys.isEmpty && zeroPreceedsLazy[T](xs, st) &&
+      (evalLazyConQS(xs) match {
+        case Spine(h, _, _) => h != Empty[T]()
+        case _              => false
+      }))
+    val dres = evalLazyConQ[T](xs, st)
+    val res = dres._1 match {
+      case Spine(head, _, rear15) =>
+        val carry = CC[T](head, ys)
+        val dres2 = evalLazyConQ[T](rear15, dres._2)
+        dres2._1 match {
+          case s @ Spine(Empty(), _, srear) =>
+            (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), srear))), dres2._2)
+          case s @ Spine(_, _, _) =>
+            (Spine[T](Empty[T](), True(), PushLeftLazy[T](carry, rear15)), dres2._2)
+          case t @ Tip(tree) =>
+            if (tree.level > carry.level) { // this case may not even be possible given additional preconditions
+              val x: ConQ[T] = t
+              (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), rear15))), dres2._2)
+            } else { // here tree level and carry level are equal
+              val x: ConQ[T] = Tip[T](CC[T](tree, carry))
+              (Spine[T](Empty[T](), False(), Eager(Spine(Empty[T](), False(), Eager[T](x)))), dres2._2)
+            }
+        }
+    }
+    res
+  } ensuring {
+    (res66: (Spine[T], Set[LazyConQ[T]])) =>
+      resSpec(ys, xs, res66._1) &&  // asserts that the result is a function of inputs excluding state
+      (res66._2 == st) && (res66._1 match {
+        case Spine(Empty(), createdWithSusp, rear) =>
+          val (rearval, _) = evalLazyConQ[T](rear, st) // this is necessary to assert properties on the state in the recursive invocation
+          (!isConcrete(xs, st) || isConcrete(pushUntilZero(rear), st))
+          //true
+        case _ =>
+          false
+      })
+  }
+
+  def pushLeftLazyLemma[T](ys: Conc[T], xs: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(!ys.isEmpty && zeroPreceedsLazy[T](xs, st) &&
+      (evalLazyConQS(xs) match {
+        case Spine(h, _, _) => h != Empty[T]() // this implies that xs.rear is also evaluated
+        case _              => false
+      }) &&
+      (evalLazyConQS(suf) match {
+        case Spine(Empty(), _, _) =>
+          concreteUntil(nthTail(xs, 2), suf, st)
+        case _ => false
+      }))
+    // property
+    (pushLeftLazy(ys, xs, st)._1 match {
+      case Spine(Empty(), createdWithSusp, rear) =>
+        // forall suf. suf*.head == Empty() ^ concUntil(xs, suf, st) => concUntil(push(rear), suf, st)
+        val p = pushUntilZero[T](rear)
+        concreteUntil(p, suf, st)
+    }) &&
+      // induction scheme
+      (evalLazyConQS(xs) match {
+        case Spine(head, _, rear15) =>
+          val carry = CC[T](head, ys)
+          evalLazyConQS(rear15) match {
+            case s @ Spine(h, _, _) if h != Empty[T]() =>
+              pushLeftLazyLemma(carry, rear15, suf, st)
+            case _ => true
+          }
+      })
+  } holds
+
+  def pushLeftWrapper[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]])
+    val (nq, nst) = pushLeft[T](ys, w.queue, st)
+    val nsched = nq match {
+      case Spine(Empty(), createdWithSusp, rear18) if createdWithSusp == True() =>
+        Cons[T](rear18, w.schedule) // this is the only case where we create a new lazy closure
+      case _ =>
+        w.schedule
+    }
+    (Eager(nq), nsched, nst)
+  } ensuring { res =>
+    strongSchedsProp(res._1, res._2, 1, res._3) && // head of the schedule may start after the first element
+      // lemma instantiations
+      (w.schedule match {
+        case Cons(head, tail) =>
+          evalLazyConQS(w.queue) match {
+            case Spine(h, _, _) if h != Empty[T]() =>
+              pushLeftLazyLemma(ys, w.queue, head, st)
+            case _ => true
+          }
+        case _ => true
+      })
+  }
+
+  /*def PushLeftLazypushLeftLazyLem[T](rear15 : LazyConQ[T], head : Conc[T], dres : (ConQ[T], Set[LazyConQ[T]]), st : Set[LazyConQ[T]], xs : LazyConQ[T], s : Spine[T], dres : (ConQ[T], Set[LazyConQ[T]]), carry : CC[T], ys : Conc[T]): Boolean =  {
+    (!ys.isEmpty && zeroPreceedsLazy[T](xs, st) && evalLazyConQS[T](xs).isSpine && dres == evalLazyConQ[T](xs, st) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && head == dres._1.head && rear15 == dres._1.rear && carry == CC[T](head, ys) && dres == evalLazyConQ[T](rear15, dres._2) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && s == dres._1) ==> (!carry.isEmpty && zeroPreceedsLazy[T](rear15, dres._2) && evalLazyConQS[T](rear15).isSpine)
+  } ensuring {
+    (holds : Boolean) => holds
+  }*/
+
+  def schedMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], scheds: Scheds[T], l: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && schedulesProperty(l, scheds, st1))
+    schedulesProperty(l, scheds, st2) && //property
+      //induction scheme
+      (scheds match {
+        case Cons(head, tail) =>
+          evalLazyConQS[T](head) match {
+            case Spine(_, _, rear) =>
+              concUntilMonotone(l, head, st1, st2) &&
+                schedMonotone(st1, st2, tail, pushUntilZero(head))
+            case _ => true
+          }
+        case Nil() =>
+          concreteMonotone(st1, st2, l)
+      })
+  } holds
+
+  def concreteMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], l: LazyConQ[T]): Boolean = {
+    require(isConcrete(l, st1) && st1.subsetOf(st2))
+    isConcrete(l, st2) && {
+      // induction scheme
+      evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          concreteMonotone[T](st1, st2, tail)
+        case _ =>
+          true
+      }
+    }
+  } holds
+
+  def concUntilExtenLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st))
+    val (next, nst) = evalLazyConQ[T](suf, st)
+    (next match {
+      case Spine(_, _, rear) =>
+        concreteUntil(q, rear, nst)
+      case _ => true
+    }) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilExtenLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  def concUntilExtenLemma2[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && isConcrete(suf, st))
+    isConcrete(q, st) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilExtenLemma2(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
+  def concUntilCompose[T](q: LazyConQ[T], suf1: LazyConQ[T], suf2: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf1, st) && concreteUntil(suf1, suf2, st))
+    concreteUntil(q, suf2, st) &&
+      (if (q != suf1) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilCompose(tail, suf1, suf2, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
+  def concUntilMonotone[T](q: LazyConQ[T], suf: LazyConQ[T], st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st1) && st1.subsetOf(st2))
+    concreteUntil(q, suf, st2) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilMonotone(tail, suf, st1, st2)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
+  def concUntilZeroPredLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && (evalLazyConQS(suf) match {
+      case Spine(Empty(), _, _) => true
+      case _                    => false
+    }))
+    val (next, nst) = evalLazyConQ[T](suf, st)
+    zeroPreceedsLazy(q, nst) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilZeroPredLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  def concreteZeroPredLemma[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(isConcrete(q, st))
+    zeroPreceedsLazy(q, st) && {
+      // induction scheme
+      evalLazyConQS[T](q) match {
+        case Spine(_, _, tail) =>
+          concreteZeroPredLemma[T](tail, st)
+        case _ =>
+          true
+      }
+    }
+  } holds
+
+  def zeroPreceedsSuf[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (q != suf) {
+      evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, rear) => true
+        case Spine(_, _, rear) =>
+          zeroPreceedsSuf(rear, suf, st)
+        case Tip(_) => true
+      }
+    } else false
+  }
+
+  def zeroPredSufConcreteUntilLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && zeroPreceedsSuf(q, suf, st))
+    zeroPreceedsLazy(q, st) && {
+      // induction scheme
+      if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(Empty(), _, _) => true
+          case Spine(_, _, tail) =>
+            zeroPredSufConcreteUntilLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true
+    }
+  } holds
+
+  def Pay[T](q: LazyConQ[T], scheds: Scheds[T], st: Set[LazyConQ[T]]): (Scheds[T], Set[LazyConQ[T]]) = {
+    require(strongSchedsProp(q, scheds, 1, st) && isEvaluated(q, st))
+    val (nschs, rst) = scheds match {
+      case c @ Cons(head, rest) =>
+        val (headval, st2) = evalLazyConQ(head, st)
+        (headval match {
+          case Spine(Empty(), createdWithSusp, rear) => // note: no other case is possible
+            if (createdWithSusp == True()) {
+              Cons(rear, rest)
+            } else
+              rest
+          //            In this case,
+          //              val prear = pushUntilZero(rear)
+          //            	concreteUntil(q, rhead, res._2) && concreteUntil(prear, rhead, st) && concreteUntil(rear, rhead, st) && schedulesProperty(prhead, rtail, st)
+        }, st2)
+      case Nil() =>
+        (scheds, st)
+    }
+    (nschs, rst)
+  } ensuring { res =>
+    val l = evalLazyConQS[T](q) match {
+      case Spine(_, _, rear) =>
+        rear
+      case _ => q
+    }
+    strongSchedsProp(q, res._1, 2, res._2) && // head of the schedule must start after the first two elements
+      zeroPreceedsLazy(q, res._2) &&
+      (scheds match {
+        case Cons(head, rest) =>
+          concUntilExtenLemma(l, head, st) &&
+            (res._1 match {
+              case Cons(rhead, rtail) =>
+                val prhead = pushUntilZero(rhead)
+                schedMonotone(st, res._2, rtail, prhead) &&
+                  (evalLazyConQS(head) match {
+                    case Spine(Empty(), cws, rear) =>
+                      if (cws == False()) {
+                        concUntilMonotone(rear, rhead, st, res._2) &&
+                          concUntilCompose(l, rear, rhead, res._2)
+                      } else true
+                  })
+              case _ =>
+                evalLazyConQS(head) match {
+                  case Spine(Empty(), _, rear) =>
+                    concreteMonotone(st, res._2, rear) &&
+                      concUntilExtenLemma2(l, rear, res._2)
+                }
+            })
+        case _ => true
+      }) &&
+      // instantiations for zeroPreceedsLazy
+      (scheds match {
+        case Cons(head, rest) =>
+          //concUntil(l, head, st) == head && head* == Spine(Empty(), _) && res._2.subsetOf(st ++ { head })
+          concUntilZeroPredLemma(l, head, st)
+        case _ =>
+          concreteZeroPredLemma(q, res._2)
+      })
+  }
+
+  def pushLeftAndPay[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]])
+    val (q, scheds, nst) = pushLeftWrapper(ys, w, st)
+    val (nscheds, fst) = Pay(q, scheds, nst)
+    (Wrapper(q, nscheds), fst)
+  } ensuring { res => res._1.valid(res._2) }
+
+  def lazyarg1[T](x: ConQ[T]): ConQ[T] = x
+}
+
+object ConQ {
+
+  abstract class LazyConQ[T1]
+
+  case class Eager[T](x: ConQ[T]) extends LazyConQ[T]
+
+  case class PushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T] /*, suf: LazyConQ[T]*/ ) extends LazyConQ[T]
+
+  @library
+  def evalLazyConQ[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    cl match {
+      case t: Eager[T] =>
+        (t.x, st)
+      case t: PushLeftLazy[T] =>
+        val (plres, plst) = pushLeftLazy[T](t.ys, t.xs, st)
+        //val plres = pushLeftLazy[T](t.ys, t.xs, uiState())._1
+        //val plst = pushLeftLazy[T](t.ys, t.xs, st)._2
+        //val plres = pushLeftLazyUI[T](t.ys, t.xs, uiState())._1
+        //val plst = pushLeftLazyUI[T](t.ys, t.xs, st)._2
+        (plres, (plst ++ Set[LazyConQ[T]](t)))
+    }
+  } ensuring { res =>
+    cl match {
+      case t : PushLeftLazy[T] =>
+        res._1 == pushLeftLazyVal(t.ys, t.xs)
+      case _ => true
+    }
+  }
+
+  def simpLemma[T](cl : LazyConQ[T], st: Set[LazyConQ[T]]) :  Boolean = {
+    evalLazyConQ(cl, st)._1 == evalLazyConQS(cl)
+  } holds
+
+  def isEvaluated[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]) = st.contains(cl) || cl.isInstanceOf[Eager[T]]
+
+  def uiState[T](): Set[LazyConQ[T]] = ???[Set[LazyConQ[T]]]
+
+  def evalLazyConQS[T](cl: LazyConQ[T]): ConQ[T] = evalLazyConQ[T](cl, uiState())._1
+
+}
+
+// Cases that had to be considered for pushLeftWrapper
+      /*(w.schedule match {
+      case Cons(head, tail) => true
+        //strongSchedsProp(res._1, res._2, res._3)
+      case _ =>
+        res._2 match {
+          case Nil() => true
+            //strongSchedsProp(res._1, res._2, res._3)
+          case _ =>
+            strongSchedsProp(res._1, res._2, res._3)
+        }
+    }) &&*/
+      /*(//pushLeft(ys, w.queue, st)._1 match {
+      //case Spine(Empty(), createdWithSusp, _) if createdWithSusp == True() => true
+        /*(w.schedule match {
+          case Cons(head, tail) =>
+            schedulesProperty(res._1, res._2, res._3)
+          case _ => true
+        })*/
+      //case Spine(Empty(), _, _)  => true
+        /*(w.schedule match {
+          case Cons(head, tail) =>
+            schedulesProperty(res._1, res._2, res._3)
+          case _ => true
+        })*/
+      //case Spine(_, _, _) =>
+        /*(w.schedule match {
+          case Cons(head, tail) =>
+            schedulesProperty(res._1, res._2, res._3)
+          case _ => true
+        })*/
+      //case _ => true
+        //schedulesProperty(res._1, res._2, res._3)
+    })*/
+
+  /*def payLemma[T](q: LazyConQ[T], scheds: Scheds[T], st: Set[LazyConQ[T]]) = {
+    require(strongSchedsProp(q, scheds, st) && isEvaluated(q, st))
+    val res = Pay(q, scheds, st)
+    val l = evalLazyConQS[T](q) match {
+      case Spine(_, _, rear) =>
+        rear
+      case _ => q
+    }
+    strongSchedsProp(q, res._1, res._2) &&
+      zeroPreceedsLazy(q, res._2) &&
+      (scheds match {
+        case Cons(head, rest) =>
+          concUntilExtenLemma(l, head, st) &&
+            (res._1 match {
+              case Cons(rhead, rtail) =>
+                val prhead = pushUntilZero(rhead)
+                schedMonotone(st, res._2, rtail, prhead) &&
+                  (evalLazyConQS(head) match {
+                    case Spine(Empty(), cws, rear) =>
+                      if (cws == False()) {
+                        concUntilMonotone(rear, rhead, st, res._2) &&
+                          concUntilCompose(l, rear, rhead, res._2)
+                      } else true
+                  })
+              case _ =>
+                evalLazyConQS(head) match {
+                  case Spine(Empty(), _, rear) =>
+                    concreteMonotone(st, res._2, rear) &&
+                      concUntilExtenLemma2(l, rear, res._2)
+                }
+            })
+        case _ => true
+      }) &&
+      // instantiations for zeroPreceedsLazy
+      (scheds match {
+        case Cons(head, rest) =>
+          //concUntil(l, head, st) == head && head* == Spine(Empty(), _) && res._2.subsetOf(st ++ { head })
+          concUntilZeroPredLemma(l, head, st)
+        case _ =>
+          concreteZeroPredLemma(q, res._2)
+      })
+  } ensuring (_ == true)*/
diff --git a/testcases/lazy-datastructures/Conqueue-strategy3.scala b/testcases/lazy-datastructures/Conqueue-strategy3.scala
new file mode 100644
index 0000000000000000000000000000000000000000..4b00893850a8615501cb6cb863b16a0995a7167b
--- /dev/null
+++ b/testcases/lazy-datastructures/Conqueue-strategy3.scala
@@ -0,0 +1,698 @@
+import leon.lang._
+import leon.annotation._
+import leon.instrumentation._
+import leon.lang.synthesis._
+import ConcTrees._
+import ConQ._
+import Conqueue._
+
+object ConcTrees {
+  abstract class Conc[T] {
+    def isEmpty: Boolean = this == Empty[T]()
+
+    def level: BigInt = {
+      this match {
+        case Empty() =>
+          BigInt(0)
+        case Single(x) =>
+          BigInt(0)
+        case CC(l, r) =>
+          BigInt(1) + max(l.level, r.level)
+      }
+    } ensuring {
+      (x$1: BigInt) => x$1 >= BigInt(0)
+    }
+  }
+
+  case class Empty[T]() extends Conc[T]
+
+  case class Single[T](x: T) extends Conc[T]
+
+  case class CC[T](left: Conc[T], right: Conc[T]) extends Conc[T]
+
+  def max(x: BigInt, y: BigInt): BigInt = if (x >= y) {
+    x
+  } else {
+    y
+  }
+
+  def abs(x: BigInt): BigInt = if (x < BigInt(0)) {
+    -x
+  } else {
+    x
+  }
+}
+
+object Conqueue {
+  abstract class ConQ[T] {
+    def isSpine: Boolean = this match {
+      case Spine(_, _, _) =>
+        true
+      case _ =>
+        false
+    }
+    def isTip: Boolean = !this.isSpine
+  }
+
+  case class Tip[T](t: Conc[T]) extends ConQ[T]
+
+  case class Spine[T](head: Conc[T], createdWithSusp: Bool, rear: LazyConQ[T]) extends ConQ[T]
+
+  sealed abstract class Bool
+  case class True() extends Bool
+  case class False() extends Bool
+
+  abstract class Scheds[T]
+
+  case class Cons[T](h: LazyConQ[T], tail: Scheds[T]) extends Scheds[T]
+
+  case class Nil[T]() extends Scheds[T]
+
+  case class Wrapper[T](queue: LazyConQ[T], schedule: Scheds[T]) {
+    def valid(st: Set[LazyConQ[T]]): Boolean =
+      strongSchedsProp(this.queue, this.schedule, st)
+  }
+
+  def zeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (isEvaluated(q, st)) {
+      evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, rear) => true
+        case Spine(_, _, rear) =>
+          zeroPreceedsLazy[T](rear, st)
+        case Tip(_) => true
+      }
+    } else false
+  }
+
+  // not used, but still interesting
+  def zeroPredLazyMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], q: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && zeroPreceedsLazy(q, st1))
+    zeroPreceedsLazy(q, st2) &&
+      //induction scheme
+      (evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, _) =>
+          true
+        case Spine(h, _, rear) =>
+          zeroPredLazyMonotone(st1, st2, rear)
+        case Tip(_) =>
+          true
+      })
+  } holds
+
+  def weakZeroPreceedsLazy[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    evalLazyConQS[T](q) match {
+      case Spine(Empty(), _, rear10) =>
+        true
+      case Spine(h, _, rear11) =>
+        zeroPreceedsLazy[T](rear11, st)
+      case Tip(_) =>
+        true
+    }
+  }
+
+  def firstUnevaluated[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): LazyConQ[T] = {
+    if (isEvaluated(l, st)) {
+      evalLazyConQS[T](l) match {
+        case Spine(_, _, tail15) =>
+          firstUnevaluated[T](tail15, st)
+        case _ =>
+          l
+      }
+    } else {
+      l
+    }
+  } ensuring {
+    (res65: LazyConQ[T]) =>
+      (evalLazyConQS[T](res65).isTip || !st.contains(res65)) &&
+        concreteUntil(l, res65, st)
+  }
+
+  def nextUnevaluatedLemma[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    val (res, nst) = evalLazyConQ[T](firstUnevaluated(l, st), st)
+    (res match {
+      case Spine(_, _, tail) =>
+        firstUnevaluated[T](l, nst) == firstUnevaluated[T](tail, nst)
+      case _ =>
+        true
+    }) &&
+      // induction scheme
+      (evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          nextUnevaluatedLemma(tail, st)
+        case _ => true
+      })
+  } holds
+
+  /**
+   * Everything until suf is evaluated
+   */
+  def concreteUntil[T](l: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    if (l != suf) {
+      isEvaluated(l, st) && (evalLazyConQS[T](l) match {
+        case Spine(_, cws, tail15) =>
+          concreteUntil(tail15, suf, st)
+        case _ =>
+          false // suf should be in the queue
+      })
+    } else true
+  }
+
+  def concreteUntilIsSuffix[T](l: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(l, suf, st))
+    suffix(l, suf) &&
+     // induction scheme
+    (if (l != suf) {
+      (evalLazyConQS[T](l) match {
+        case Spine(_, cws, tail15) =>
+          concreteUntilIsSuffix(tail15, suf, st)
+        case _ =>
+          true
+      })
+    } else true)
+  }.holds
+
+  def isConcrete[T](l: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = isEvaluated(l, st) && (evalLazyConQS[T](l) match {
+    case Spine(_, _, tail13) =>
+      isConcrete[T](tail13, st)
+    case _ =>
+      true
+  })
+
+  def schedulesProperty[T](q: LazyConQ[T], schs: Scheds[T], st: Set[LazyConQ[T]]): Boolean = {
+    schs match {
+      case Cons(head, tail) =>
+        evalLazyConQS[T](head) match {
+          case Spine(Empty(), _, _) =>
+            !head.isInstanceOf[Eager[T]] &&
+              concreteUntil(q, head, st) && schedulesProperty[T](pushUntilZero[T](head), tail, st)
+          case _ =>
+            false
+        }
+      case Nil() =>
+        isConcrete(q, st)
+    }
+  }
+
+  def strongSchedsProp[T](q: LazyConQ[T], schs: Scheds[T], st: Set[LazyConQ[T]]): Boolean = {
+    isEvaluated(q, st) && {
+      schs match {
+        case Cons(head, tail) =>
+          zeroPreceedsSuf(q, head) // zeroPreceedsSuf holds initially
+        case Nil() => true
+      }
+    } &&
+      schedulesProperty(q, schs, st)
+  }
+
+  // pushes a carry until there is a one
+  // TODO: rename this to pushUntilCarry
+  def pushUntilZero[T](q: LazyConQ[T]): LazyConQ[T] = evalLazyConQS[T](q) match {
+    case Spine(Empty(), _, rear12) =>
+      pushUntilZero[T](rear12)
+    case Spine(h, _, rear13) =>
+      rear13
+    case Tip(_) =>
+      q
+  }
+
+  // not used as of now
+  def pushUntilZeroIsSuffix[T](q: LazyConQ[T]): Boolean = {
+    suffix(q, pushUntilZero(q)) &&
+    (evalLazyConQS[T](q) match {
+      case Spine(Empty(), _, rear12) =>
+        pushUntilZeroIsSuffix(rear12)
+      case Spine(h, _, rear13) =>
+        true
+      case Tip(_) =>
+        true
+    })
+  }.holds
+
+  def pushLeft[T](ys: Single[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(zeroPreceedsLazy[T](xs, st) && ys.isInstanceOf[Single[T]])
+
+    val dres5 = evalLazyConQ[T](xs, st)
+    dres5._1 match {
+      case Tip(CC(_, _)) =>
+        (Spine[T](ys, False(), xs), dres5._2)
+      case Tip(Empty()) =>
+        (Tip[T](ys), dres5._2)
+      case Tip(t @ Single(_)) =>
+        (Tip[T](CC[T](ys, t)), dres5._2)
+      case s @ Spine(Empty(), _, rear) =>
+        (Spine[T](ys, False(), rear), dres5._2) // in this case firstUnevaluated[T](rear, st) == firstUnevaluated[T](xs, st)
+      case s @ Spine(_, _, _) =>
+        pushLeftLazy[T](ys, xs, dres5._2)
+    }
+  }
+
+/*  def dummyFun[T]() = ???[(ConQ[T], Set[LazyConQ[T]])]
+
+  @library
+  def pushLeftLazyUI[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    dummyFun()
+  } ensuring (res => res._2 == st && (res._1 match {
+    case Spine(Empty(), createdWithSusp, rear) =>
+      true
+    case _ => false
+  }))*/
+
+  def pushLeftLazyVal[T](ys: Conc[T], xs: LazyConQ[T]) : ConQ[T] = ???[ConQ[T]]
+
+  @library
+  def resSpec[T](ys: Conc[T], xs: LazyConQ[T], res : ConQ[T]) = {
+    res == pushLeftLazyVal(ys, xs)
+  } holds
+
+  def pushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    require(!ys.isEmpty && zeroPreceedsLazy[T](xs, st) &&
+      (evalLazyConQS(xs) match {
+        case Spine(h, _, _) => h != Empty[T]()
+        case _              => false
+      }))
+    val dres = evalLazyConQ[T](xs, st)
+    val res = dres._1 match {
+      case Spine(head, _, rear15) =>
+        val carry = CC[T](head, ys)
+        val dres2 = evalLazyConQ[T](rear15, dres._2)
+        dres2._1 match {
+          case s @ Spine(Empty(), _, srear) =>
+            (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), srear))), dres2._2)
+          case s @ Spine(_, _, _) =>
+            (Spine[T](Empty[T](), True(), PushLeftLazy[T](carry, rear15)), dres2._2)
+          case t @ Tip(tree) =>
+            if (tree.level > carry.level) { // this case may not even be possible given additional preconditions
+              val x: ConQ[T] = t
+              (Spine[T](Empty[T](), False(), Eager(Spine(carry, False(), rear15))), dres2._2)
+            } else { // here tree level and carry level are equal
+              val x: ConQ[T] = Tip[T](CC[T](tree, carry))
+              (Spine[T](Empty[T](), False(), Eager(Spine(Empty[T](), False(), Eager[T](x)))), dres2._2)
+            }
+        }
+    }
+    res
+  } ensuring {
+    (res66: (Spine[T], Set[LazyConQ[T]])) =>
+      resSpec(ys, xs, res66._1) &&  // asserts that the result is a function of inputs excluding state
+      (res66._2 == st) && (res66._1 match {
+        case Spine(Empty(), createdWithSusp, rear) =>
+          val (rearval, _) = evalLazyConQ[T](rear, st) // this is necessary to assert properties on the state in the recursive invocation
+          (!isConcrete(xs, st) || isConcrete(pushUntilZero(rear), st))
+          //true
+        case _ =>
+          false
+      })
+  }
+
+  def pushLeftLazyLemma[T](ys: Conc[T], xs: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(!ys.isEmpty && zeroPreceedsSuf(xs, suf) &&
+      (evalLazyConQS(xs) match {
+        case Spine(h, _, _) => h != Empty[T]() // this implies xs.rear is also evaluated
+        case _              => false
+      }) &&
+      (evalLazyConQS(suf) match {
+        case Spine(Empty(), _, _) =>
+          concreteUntil(xs, suf, st)
+        case _ => false
+      }))
+    // instantiate the lemma
+    zeroPredSufConcreteUntilLemma(xs, suf, st) &&
+    // property
+    (pushLeftLazy(ys, xs, st)._1 match {
+      case Spine(Empty(), createdWithSusp, rear) =>
+        // forall suf. suf*.head == Empty() ^ concUntil(xs, suf, st) => concUntil(push(rear), suf, st)
+        val p = pushUntilZero[T](rear)
+        concreteUntil(p, suf, st)
+    }) &&
+      // induction scheme
+      (evalLazyConQS(xs) match {
+        case Spine(head, _, rear15) =>
+          val carry = CC[T](head, ys)
+          evalLazyConQS(rear15) match {
+            case s @ Spine(h, _, _) if h != Empty[T]() =>
+              pushLeftLazyLemma(carry, rear15, suf, st)
+            case _ => true
+          }
+      })
+  } holds
+
+  def pushLeftWrapper[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]] &&
+        // instantiate the lemma that implies zeroPreceedsLazy
+        { w.schedule match {
+          case Cons(h, _) =>
+            zeroPredSufConcreteUntilLemma(w.queue, h, st)
+          case _ =>
+            concreteZeroPredLemma(w.queue, st)
+        }})
+    val (nq, nst) = pushLeft[T](ys, w.queue, st)
+    val nsched = nq match {
+      case Spine(Empty(), createdWithSusp, rear18) if createdWithSusp == True() =>
+        Cons[T](rear18, w.schedule) // this is the only case where we create a new lazy closure
+      case _ =>
+        w.schedule
+    }
+    (Eager(nq), nsched, nst)
+  } ensuring { res =>
+    isEvaluated(res._1, res._3) &&
+    schedulesProperty(res._1, res._2, res._3) && // head of the schedule may start after the first element
+      // lemma instantiations
+      (w.schedule match {
+        case Cons(head, tail) =>
+          evalLazyConQS(w.queue) match {
+            case Spine(h, _, _) if h != Empty[T]() =>
+              pushLeftLazyLemma(ys, w.queue, head, st)
+            case _ => true
+          }
+        case _ => true
+      })
+  }
+
+  /*def PushLeftLazypushLeftLazyLem[T](rear15 : LazyConQ[T], head : Conc[T], dres : (ConQ[T], Set[LazyConQ[T]]), st : Set[LazyConQ[T]], xs : LazyConQ[T], s : Spine[T], dres : (ConQ[T], Set[LazyConQ[T]]), carry : CC[T], ys : Conc[T]): Boolean =  {
+    (!ys.isEmpty && zeroPreceedsLazy[T](xs, st) && evalLazyConQS[T](xs).isSpine && dres == evalLazyConQ[T](xs, st) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && head == dres._1.head && rear15 == dres._1.rear && carry == CC[T](head, ys) && dres == evalLazyConQ[T](rear15, dres._2) && (!dres._1.isInstanceOf[Spine[T]] || !dres._1.head.isInstanceOf[Empty[T]]) && dres._1.isInstanceOf[Spine[T]] && s == dres._1) ==> (!carry.isEmpty && zeroPreceedsLazy[T](rear15, dres._2) && evalLazyConQS[T](rear15).isSpine)
+  } ensuring {
+    (holds : Boolean) => holds
+  }*/
+
+  def schedMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], scheds: Scheds[T], l: LazyConQ[T]): Boolean = {
+    require(st1.subsetOf(st2) && schedulesProperty(l, scheds, st1))
+    schedulesProperty(l, scheds, st2) && //property
+      //induction scheme
+      (scheds match {
+        case Cons(head, tail) =>
+          evalLazyConQS[T](head) match {
+            case Spine(_, _, rear) =>
+              concUntilMonotone(l, head, st1, st2) &&
+                schedMonotone(st1, st2, tail, pushUntilZero(head))
+            case _ => true
+          }
+        case Nil() =>
+          concreteMonotone(st1, st2, l)
+      })
+  } holds
+
+  def concreteMonotone[T](st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]], l: LazyConQ[T]): Boolean = {
+    require(isConcrete(l, st1) && st1.subsetOf(st2))
+    isConcrete(l, st2) && {
+      // induction scheme
+      evalLazyConQS[T](l) match {
+        case Spine(_, _, tail) =>
+          concreteMonotone[T](st1, st2, tail)
+        case _ =>
+          true
+      }
+    }
+  } holds
+
+  def concUntilExtenLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st))
+    val (next, nst) = evalLazyConQ[T](suf, st)
+    (next match {
+      case Spine(_, _, rear) =>
+        concreteUntil(q, rear, nst)
+      case _ => true
+    }) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilExtenLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  def concUntilExtenLemma2[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && isConcrete(suf, st))
+    isConcrete(q, st) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilExtenLemma2(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
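+  // concreteUntil composes: concreteUntil(q, suf1) and concreteUntil(suf1, suf2) imply concreteUntil(q, suf2)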
+  def concUntilCompose[T](q: LazyConQ[T], suf1: LazyConQ[T], suf2: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf1, st) && concreteUntil(suf1, suf2, st))
+    concreteUntil(q, suf2, st) &&
+      (if (q != suf1) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilCompose(tail, suf1, suf2, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
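+  // concreteUntil is preserved when the set of evaluated closures grows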
+  def concUntilMonotone[T](q: LazyConQ[T], suf: LazyConQ[T], st1: Set[LazyConQ[T]], st2: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st1) && st1.subsetOf(st2))
+    concreteUntil(q, suf, st2) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilMonotone(tail, suf, st1, st2)
+          case _ =>
+            true
+        }
+      } else true)
+  } ensuring (_ == true)
+
+  /**
+   * Not used in the proof
+   */
+  def concUntilZeroPredLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && (evalLazyConQS(suf) match {
+      case Spine(Empty(), _, _) => true
+      case _                    => false
+    }))
+    val (next, nst) = evalLazyConQ[T](suf, st)
+    zeroPreceedsLazy(q, nst) &&
+      (if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            concUntilZeroPredLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  def concreteZeroPredLemma[T](q: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(isConcrete(q, st))
+    zeroPreceedsLazy(q, st) && {
+      // induction scheme
+      evalLazyConQS[T](q) match {
+        case Spine(_, _, tail) =>
+          concreteZeroPredLemma[T](tail, st)
+        case _ =>
+          true
+      }
+    }
+  } holds
+
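+  // checks whether there is a zero (an Empty head) before the given suffix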
+  def zeroPreceedsSuf[T](q: LazyConQ[T], suf: LazyConQ[T]): Boolean = {
+    if (q != suf) {
+      evalLazyConQS[T](q) match {
+        case Spine(Empty(), _, rear) => true
+        case Spine(_, _, rear) =>
+          zeroPreceedsSuf(rear, suf)
+        case Tip(_) => false
+      }
+    } else false
+  }
+
+  def zeroPredSufConcreteUntilLemma[T](q: LazyConQ[T], suf: LazyConQ[T], st: Set[LazyConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf, st) && zeroPreceedsSuf(q, suf))
+    zeroPreceedsLazy(q, st) && {
+      // induction scheme
+      if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(Empty(), _, _) => true
+          case Spine(_, _, tail) =>
+            zeroPredSufConcreteUntilLemma(tail, suf, st)
+          case _ =>
+            true
+        }
+      } else true
+    }
+  } holds
+
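+  // forces the first scheduled closure (if any) and returns the updated schedule and state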
+  def Pay[T](q: LazyConQ[T], scheds: Scheds[T], st: Set[LazyConQ[T]]): (Scheds[T], Set[LazyConQ[T]]) = {
+    require(schedulesProperty(q, scheds, st) && isEvaluated(q, st))
+    val (nschs, rst) = scheds match {
+      case c @ Cons(head, rest) =>
+        val (headval, st2) = evalLazyConQ(head, st)
+        (headval match {
+          case Spine(Empty(), createdWithSusp, rear) => // note: no other case is possible
+            if (createdWithSusp == True()) {
+              Cons(rear, rest)
+            } else
+              rest
+          //            In this case,
+          //              val prear = pushUntilZero(rear)
+          //            	concreteUntil(q, rhead, res._2) && concreteUntil(prear, rhead, st) && concreteUntil(rear, rhead, st) && schedulesProperty(prhead, rtail, st)
+        }, st2)
+      case Nil() =>
+        (scheds, st)
+    }
+    (nschs, rst)
+  } ensuring { res =>
+    schedulesProperty(q, res._1, res._2) &&
+    //strongSchedsProp(q, res._1, res._2) &&
+      (scheds match {
+        case Cons(head, rest) =>
+          concUntilExtenLemma(q, head, st) &&
+            (res._1 match {
+              case Cons(rhead, rtail) =>
+                val prhead = pushUntilZero(rhead)
+                schedMonotone(st, res._2, rtail, prhead) &&
+                  (evalLazyConQS(head) match {
+                    case Spine(Empty(), cws, rear) =>
+                      if (cws == False()) {
+                        concUntilMonotone(rear, rhead, st, res._2) &&
+                          concUntilCompose(q, rear, rhead, res._2)
+                      } else true
+                  })
+              case _ =>
+                evalLazyConQS(head) match {
+                  case Spine(Empty(), _, rear) =>
+                    concreteMonotone(st, res._2, rear) &&
+                      concUntilExtenLemma2(q, rear, res._2)
+                }
+            })
+        case _ => true
+      }) &&
+      // instantiations for zeroPreceedsSuf
+      (scheds match {
+        case Cons(head, rest) =>
+          val phead = pushUntilZero(head)
+          concreteUntilIsSuffix(q, head, st) &&
+          (res._1 match {
+            case Cons(rhead, rtail) =>
+              concreteUntilIsSuffix(phead, rhead, st) &&
+                suffixZeroLemma(q, head, rhead) &&
+                zeroPreceedsSuf(q, rhead)
+              //suffix(l, head) && head* == Spine(Empty(), _) && suffix(head, rhead) ==> zeroPreceedsSuf(l, rhead)
+            case _ =>
+              true
+          })
+        case _ =>
+          true
+      })
+  }
+
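+  // suffix predicates and their properties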
+  def suffix[T](q: LazyConQ[T], suf: LazyConQ[T]): Boolean = {
+    if(q == suf) true
+    else {
+      evalLazyConQS(q) match {
+        case Spine(_, _, rear) =>
+          suffix(rear, suf)
+        case Tip(_) => false
+      }
+    }
+  }
+
+  def suffixCompose[T](q: LazyConQ[T], suf1: LazyConQ[T],  suf2: LazyConQ[T]): Boolean = {
+    require(suffix(q,suf1) && properSuffix(suf1, suf2))
+    properSuffix(q, suf2) &&
+    // induction over suffix(q, suf1)
+    (if(q == suf1) true
+    else {
+      evalLazyConQS(q) match {
+        case Spine(_, _, rear) =>
+          suffixCompose(rear, suf1, suf2)
+        case Tip(_) => false
+      }
+    })
+  }.holds
+
+  // TODO: this should be inferable from the model
+  @library
+  def properSuffix[T](l: LazyConQ[T], suf: LazyConQ[T]) : Boolean = {
+    evalLazyConQS(l) match {
+      case Spine(_, _, rear) =>
+        suffix(rear, suf)
+      case _ => false
+    }
+  } ensuring(res => !res || suf != l)
+
+  // uncomment this once the model is fixed
+  /*def suffixInEquality[T](l: LazyConQ[T], suf: LazyConQ[T], suf2: ) : Boolean = {
+    require(properSuffix(l, suf))
+    (l != suf) && (
+      evalLazyConQS(l) match {
+        case Spine(_, _, rear) =>
+          suffixInEquality(rear, suf)
+        case _ => false
+      })
+  }.holds*/
+
+  def suffixZeroLemma[T](q: LazyConQ[T], suf: LazyConQ[T], suf2: LazyConQ[T]): Boolean = {
+    require(evalLazyConQS(suf) match {
+      case Spine(Empty(), _, _) =>
+        suffix(q, suf) && properSuffix(suf, suf2)
+      case _ => false
+    })
+    suffixCompose(q, suf, suf2) &&
+    zeroPreceedsSuf(q, suf2) && (
+        // induction scheme
+      if (q != suf) {
+        evalLazyConQS[T](q) match {
+          case Spine(_, _, tail) =>
+            suffixZeroLemma(tail, suf, suf2)
+          case _ =>
+            true
+        }
+      } else true)
+  }.holds
+
+  def pushLeftAndPay[T](ys: Single[T], w: Wrapper[T], st: Set[LazyConQ[T]]) = {
+    require(w.valid(st) && ys.isInstanceOf[Single[T]])
+    val (q, scheds, nst) = pushLeftWrapper(ys, w, st)
+    val (nscheds, fst) = Pay(q, scheds, nst)
+    (Wrapper(q, nscheds), fst)
+  } ensuring { res => res._1.valid(res._2) }
+
+  def lazyarg1[T](x: ConQ[T]): ConQ[T] = x
+}
+
+object ConQ {
+
+  abstract class LazyConQ[T1]
+
+  case class Eager[T](x: ConQ[T]) extends LazyConQ[T]
+
+  case class PushLeftLazy[T](ys: Conc[T], xs: LazyConQ[T] /*, suf: LazyConQ[T]*/ ) extends LazyConQ[T]
+
+  @library
+  def evalLazyConQ[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]): (ConQ[T], Set[LazyConQ[T]]) = {
+    cl match {
+      case t: Eager[T] =>
+        (t.x, st)
+      case t: PushLeftLazy[T] =>
+        val (plres, plst) = pushLeftLazy[T](t.ys, t.xs, st)
+        (plres, (plst ++ Set[LazyConQ[T]](t)))
+    }
+  } ensuring { res =>
+    (cl match {
+      case t : PushLeftLazy[T] =>
+        res._1 == pushLeftLazyVal(t.ys, t.xs)
+      case _ => true
+    })
+  }
+
+  def simpLemma[T](cl : LazyConQ[T], st: Set[LazyConQ[T]]) :  Boolean = {
+    evalLazyConQ(cl, st)._1 == evalLazyConQS(cl)
+  }.holds
+
+  def isEvaluated[T](cl: LazyConQ[T], st: Set[LazyConQ[T]]) = st.contains(cl) || cl.isInstanceOf[Eager[T]]
+
+  def uiState[T](): Set[LazyConQ[T]] = ???[Set[LazyConQ[T]]]
+
+  def evalLazyConQS[T](cl: LazyConQ[T]): ConQ[T] = evalLazyConQ[T](cl, uiState())._1
+
+}
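
Note: the encoding above threads the set of evaluated closures through every call, so forcing
a closure returns its value together with a larger state, and isEvaluated reduces to set
membership (see evalLazyConQ/isEvaluated). A minimal plain-Scala sketch of this state-passing
idea, with illustrative names and independent of Leon:

  object LazyStateSketch {
    sealed abstract class Lazy[T]
    case class Eager[T](x: T) extends Lazy[T]
    case class Susp[T](f: () => T) extends Lazy[T]

    // forcing a cell returns its value together with the updated set of forced cells
    def force[T](cl: Lazy[T], st: Set[Lazy[T]]): (T, Set[Lazy[T]]) = cl match {
      case Eager(x) => (x, st)
      case Susp(f)  => (f(), st + cl)
    }

    def isEvaluated[T](cl: Lazy[T], st: Set[Lazy[T]]): Boolean =
      st.contains(cl) || cl.isInstanceOf[Eager[T]]

    def main(args: Array[String]): Unit = {
      val cl: Lazy[Int] = Susp(() => 1 + 2)
      val (v, st1) = force(cl, Set[Lazy[Int]]())
      println((v, isEvaluated(cl, Set[Lazy[Int]]()), isEvaluated(cl, st1))) // (3,false,true)
    }
  }
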
diff --git a/testcases/lazy-datastructures/Conqueue.scala b/testcases/lazy-datastructures/Conqueue.scala
index 51e5a827ab6e8c8bdf94a030b20acf57daa1d8d4..3d5ea675fde56caee5eae3423873ec29c79cb675 100644
--- a/testcases/lazy-datastructures/Conqueue.scala
+++ b/testcases/lazy-datastructures/Conqueue.scala
@@ -2,26 +2,27 @@ package lazybenchmarks
 
 import leon.lazyeval._
 import leon.lang._
+import leon.math._
 import leon.annotation._
 import leon.instrumentation._
+import leon.lazyeval.$._
 
 object ConcTrees {
 
-  def max(x: BigInt, y: BigInt): BigInt = if (x >= y) x else y
-  def abs(x: BigInt): BigInt = if (x < 0) -x else x
-
   sealed abstract class Conc[T] {
     def isEmpty: Boolean = {
       this == Empty[T]()
     }
 
+    // Note: the instrumentation phase is unsound in its handling of fields, as it does not account for their cost in object construction.
+    // Fix this.
     val level: BigInt = {
-      (this match {
-        case Empty()   => 0
-        case Single(x) => 0
+      this match {
+        case Empty()   => BigInt(0)
+        case Single(x) => BigInt(0)
         case CC(l, r) =>
-          1 + max(l.level, r.level)
-      }): BigInt
+          BigInt(1) + max(l.level, r.level)
+      }
     } ensuring (_ >= 0)
   }
 
@@ -33,82 +34,72 @@ object ConcTrees {
 import ConcTrees._
 object Conqueue {
 
+  sealed abstract class ConQ[T] {
+    val isSpine: Boolean = this match {
+      case Spine(_, _, _) => true
+      case _              => false
+    }
+    val isTip = !isSpine
+  }
+
+  case class Tip[T](t: Conc[T]) extends ConQ[T]
+  case class Spine[T](head: Conc[T], createdWithSuspension: Bool, rear: $[ConQ[T]]) extends ConQ[T]
+
+  sealed abstract class Bool
+  case class True() extends Bool
+  case class False() extends Bool
+
+  /**
+   * Checks whether there is a zero before an unevaluated closure
+   */
   def zeroPreceedsLazy[T](q: $[ConQ[T]]): Boolean = {
     if (q.isEvaluated) {
       q* match {
-        case Spine(Empty(), rear) =>
+        case Spine(Empty(), _, rear) =>
           true // here we have seen a zero
-        case Spine(h, rear) =>
+        case Spine(h, _, rear) =>
           zeroPreceedsLazy(rear) //here we have not seen a zero
         case Tip(_) => true
       }
-    } else false // this implies that a ConQ cannot start with a lazy closure
-  } //ensuring (res => !res || weakZeroPreceedsLazy(q)) //zeroPreceedsLazy is a stronger property
-
-  /*def weakZeroPreceedsLazy[T](q: $[ConQ[T]]): Boolean = {
-    q* match {
-      case Spine(Empty(), rear) =>
-        weakZeroPreceedsLazy(rear)
-      case Spine(h, rear) =>
-        zeroPreceedsLazy(rear) //here we have not seen a zero
-      case Tip(_) => true
-    }
-  }*/
-
-  def isConcrete[T](l: $[ConQ[T]]): Boolean = {
-    (l.isEvaluated && (l* match {
-      case Spine(_, tail) =>
-        isConcrete(tail)
-      case _ => true
-    })) || (l*).isTip
+    } else false
   }
 
-  // an axiom about lazy streams (this should be a provable axiom, but not as of now)
-  @axiom
-  def streamLemma[T](l: $[ConQ[T]]) = {
-    l.isEvaluated ||
-      (l* match {
-        case Spine(_, tail) =>
-          l != tail && !tail.isEvaluated
-        case _ => true
-      })
-  } holds
-
-  def firstUnevaluated[T](l: $[ConQ[T]]): $[ConQ[T]] = {
-    if (l.isEvaluated) {
-      l* match {
-        case Spine(_, tail) =>
-          firstUnevaluated(tail)
-        case _ => l
+  /**
+   * Checks whether there is a zero before a given suffix
+   */
+  def zeroPreceedsSuf[T](q: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    if (q != suf) {
+      q* match {
+        case Spine(Empty(), _, rear) => true
+        case Spine(_, _, rear) =>
+          zeroPreceedsSuf(rear, suf)
+        case Tip(_) => false
       }
-    } else
-      l
-  } ensuring (res => ((res*).isSpine || isConcrete(l)) && //if there are no lazy closures then the stream is concrete
-    ((res*).isTip || !res.isEvaluated) && // if the return value is not a Tip closure then it would not have been evaluated
-    streamLemma(res)  &&
-    (res.value match {
-      case Spine(_, tail) =>
-        firstUnevaluated(l) == tail // after evaluating the firstUnevaluated closure in 'l' we get the next unevaluated closure
-      case _ => true
-    }))
-
-  def nextUnevaluated[T](l: $[ConQ[T]]) : $[ConQ[T]] = {
-    l* match {
-      case Spine(_, tail) => firstUnevaluated(tail)
-      case _ => l
-    }
+    } else false
   }
 
-  sealed abstract class ConQ[T] {
-    val isSpine: Boolean = this match {
-      case Spine(_, _) => true
-      case _           => false
-    }
-    val isTip = !isSpine
+  /**
+   * Everything until suf is evaluated. This
+   * also asserts that suf should be a suffix of the list
+   */
+  def concreteUntil[T](l: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    if (l != suf) {
+      l.isEvaluated && (l* match {
+        case Spine(_, cws, tail) =>
+          concreteUntil(tail, suf)
+        case _ =>
+          false
+      })
+    } else true
   }
 
-  case class Tip[T](t: Conc[T]) extends ConQ[T]
-  case class Spine[T](head: Conc[T], rear: $[ConQ[T]]) extends ConQ[T]
+  def isConcrete[T](l: $[ConQ[T]]): Boolean = {
+    l.isEvaluated && (l* match {
+      case Spine(_, _, tail) =>
+        isConcrete(tail)
+      case _ => true
+    })
+  }
 
   sealed abstract class Scheds[T]
   case class Cons[T](h: $[ConQ[T]], tail: Scheds[T]) extends Scheds[T]
@@ -117,183 +108,459 @@ object Conqueue {
   def schedulesProperty[T](q: $[ConQ[T]], schs: Scheds[T]): Boolean = {
     schs match {
       case Cons(head, tail) =>
-        (head*).isSpine &&
-        (firstUnevaluated(q) == head) && { //sch is the first lazy closure of 's'
-          schedulesProperty(pushUntilZero(head), tail)
+        head* match {
+          case Spine(Empty(), _, _) =>
+            head.isSuspension(pushLeftLazy[T] _) &&
+              concreteUntil(q, head) &&
+              schedulesProperty(pushUntilCarry(head), tail)
+          case _ =>
+            false
         }
       case Nil() =>
         isConcrete(q)
     }
   }
 
-  def pushUntilZero[T](q: $[ConQ[T]]) : $[ConQ[T]] = {
+  def strongSchedsProp[T](q: $[ConQ[T]], schs: Scheds[T]) = {
+    q.isEvaluated && {
+      schs match {
+        case Cons(head, tail) =>
+          zeroPreceedsSuf(q, head) // zeroPreceedsSuf holds initially
+        case Nil() => true
+      }
+    } &&
+      schedulesProperty(q, schs)
+  }
+
+  /**
+   * Note: if 'q' has a suspension then it would have a carry.
+   */
+  def pushUntilCarry[T](q: $[ConQ[T]]): $[ConQ[T]] = {
     q* match {
-      case Spine(Empty(), rear) =>
-        // here, rear would be a new unevaluated closure
-        pushUntilZero(rear)
-      case Spine(h, rear) =>
-        //here, the first unevaluated closure is fully pushed
-        // rear could either be the unevaluated or evaluated
+      case Spine(Empty(), _, rear) => // if we push a carry and get back 0 then there is a new carry
+        pushUntilCarry(rear)
+      case Spine(h, _, rear) => // if we push a carry and get back 1 then the carry has been fully pushed
         rear
       case Tip(_) =>
         q
     }
   }
 
-  // this is a temporary property
-  /*def weakSchedulesProperty[T](q: $[ConQ[T]], schs: Scheds[T]): Boolean = {
-    schs match {
-      case Cons(head, tail) =>
-        firstUnevaluated(q) == head && {
-          val _ = head.value // evaluate the head and on the evaluated state the following should hold
-          schedulesProperty(head, tail)
-        }
-      case Nil() =>
-        isConcrete(q)
-    }
-  }*/
-
-  case class Wrapper[T](queue: $[ConQ[T]], schedule: Scheds[T]) {
-    val valid: Boolean = {
-      zeroPreceedsLazy(queue) && schedulesProperty(queue, schedule)
-    }
+  case class Queue[T](queue: $[ConQ[T]], schedule: Scheds[T]) {
+    val valid = strongSchedsProp(queue, schedule)
   }
 
   def pushLeft[T](ys: Single[T], xs: $[ConQ[T]]): ConQ[T] = {
     require(zeroPreceedsLazy(xs))
     xs.value match {
       case Tip(CC(_, _)) =>
-        Spine(ys, xs)
+        Spine(ys, False(), xs)
       case Tip(Empty()) =>
         Tip(ys)
       case Tip(t @ Single(_)) =>
         Tip(CC(ys, t))
-      case s@Spine(_, _) =>
-        pushLeftLazy(ys, xs) //ensure precondition here
+      case s @ Spine(Empty(), _, rear) =>
+        Spine[T](ys, False(), rear)
+      case s @ Spine(_, _, _) =>
+        pushLeftLazy(ys, xs)
     }
-  }
+  } ensuring (_ => time <= 70)
 
-  // this procedure does not change state (can this be automatically inferred)
+  // this procedure does not change state
+  // TODO: can `invstate` annotations be automatically inferred?
   @invstate
   def pushLeftLazy[T](ys: Conc[T], xs: $[ConQ[T]]): ConQ[T] = {
-    require(!ys.isEmpty && zeroPreceedsLazy(xs) && (xs*).isSpine)
-    //(xs.head.isEmpty || xs.head.level == ys.level))
-    xs.value match { //xs.value is evaluated by the 'zeroPreceedsLazy' invariant
-      case Spine(Empty(), rear) =>
-        Spine(ys, rear) // if rear is an unevaluated 'pushLazy', this would temporarily break the 'zeroPreceedsLazy' invariant
-      case Spine(head, rear) => // here, rear has to be evaluated by 'zeroPreceedsLazy' invariant
+    require(!ys.isEmpty && zeroPreceedsLazy(xs) &&
+      (xs* match {
+        case Spine(h, _, _) => h != Empty[T]()
+        case _              => false
+      }))
+    //an additional precondition that is necessary for correctness: xs.head.level == ys.level
+    xs.value match {
+      case Spine(head, _, rear) => // here, rear is guaranteed to be evaluated by 'zeroPreceedsLazy' invariant
         val carry = CC(head, ys) //here, head and ys are of the same level
         rear.value match {
-          /*case s @ Spine(Empty(), srear) =>
-            val x : ConQ[T] = Spine(carry, srear)
-            Spine(Empty(), $(x))*/
+          case s @ Spine(Empty(), _, srear) =>
+            val tail: ConQ[T] = Spine(carry, False(), srear)
+            Spine(Empty(), False(), tail)
 
-          case s @ Spine(_, _) =>
-            Spine(Empty(), $(pushLeftLazy(carry, rear)))
+          case s @ Spine(_, _, _) =>
+            Spine(Empty(), True(), $(pushLeftLazy(carry, rear)))
 
           case t @ Tip(tree) =>
-            val x : ConQ[T] = t
-            Spine(Empty(), $(x))
-            /*if (tree.level > carry.level) { // can this happen ? this means tree is of level at least two greater than rear ?
-              val x : ConQ[T] = t
-              val y : ConQ[T] = Spine(carry, $(x))
-              Spine(Empty(), $(y))
+            if (tree.level > carry.level) { // can this happen? this would mean tree is of level at least two greater than rear
+              val y: ConQ[T] = Spine(carry, False(), rear)
+              Spine(Empty(), False(), y)
+            } else { // here tree level and carry level are equal
+              val x: ConQ[T] = Tip(CC(tree, carry))
+              val y: ConQ[T] = Spine(Empty(), False(), x)
+              Spine(Empty(), False(), y)
             }
-            else {// here tree level and carry level are equal
-              val x : ConQ[T] = Tip(CC(tree, carry))
-              val y : ConQ[T] = Spine(Empty(), $(x))
-              Spine(Empty(), $(y))
-            }*/
         }
     }
-  } ensuring (res => res.isSpine && (res match {
-    case Spine(Empty(), rear) =>
-      rear* match {
-        case Spine(h, _) =>
-          xs* match {
-            case Spine(h, xsrear) =>
-              xsrear* match {
-                case Spine(Empty(), _) =>
-                  (firstUnevaluated(pushUntilZero(rear)) == firstUnevaluated(xs))
-                case _ => true
-              }
+  } ensuring { res =>
+    (res match {
+      case Spine(Empty(), _, rear) =>
+        (!isConcrete(xs) || isConcrete(pushUntilCarry(rear))) &&
+          {
+            val _ = rear.value // this is necessary to assert properties on the state in the recursive invocation (and note this cannot go first)
+            rear.isEvaluated // this is a tautology
           }
-          /*(firstUnevaluated(pushUntilZero(rear)) == firstUnevaluated(xs)) ||
-            (firstUnevaluated(xs)*).isTip*/
-      }
-    case Spine(h, rear) =>
-      firstUnevaluated(rear) == firstUnevaluated(xs)
-    case _ => true
-  }))
+      case _ =>
+        false
+    }) &&
+      time <= 40
+  }
 
-  def pushLeftAndPay[T](ys: Single[T], w: Wrapper[T]): (ConQ[T], Scheds[T]) = {
-    require(w.valid)
+  /**
+   * Lemma:
+   * forall suf. suf*.head != Empty() ^ zeroPredsSuf(xs, suf) ^ concUntil(xs.tail.tail, suf) => concUntil(push(rear), suf)
+   */
+  def pushLeftLazyLemma[T](ys: Conc[T], xs: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    require(!ys.isEmpty && zeroPreceedsSuf(xs, suf) &&
+      (xs* match {
+        case Spine(h, _, _) => h != Empty[T]()
+        case _              => false
+      }) &&
+      (suf* match {
+        case Spine(Empty(), _, _) =>
+          concreteUntil(xs, suf)
+        case _ => false
+      }))
+    // instantiate the lemma that implies zeroPreceedsLazy
+    (if (zeroPredSufConcreteUntilLemma(xs, suf)) {
+      // property
+      (pushLeftLazy(ys, xs) match {
+        case Spine(Empty(), _, rear) =>
+          concreteUntil(pushUntilCarry(rear), suf)
+      })
+    } else false) &&
+      // induction scheme
+      (xs* match {
+        case Spine(head, _, rear) =>
+          val carry = CC[T](head, ys)
+          rear* match {
+            case s @ Spine(h, _, _) =>
+              if (h != Empty[T]())
+                pushLeftLazyLemma(carry, rear, suf)
+              else true
+            case _ => true
+          }
+      })
+  } holds
+
+  // verifies in 300 secs
+  def pushLeftWrapper[T](ys: Single[T], w: Queue[T]) = {
+    require(w.valid &&
+      // instantiate the lemma that implies zeroPreceedsLazy
+      (w.schedule match {
+        case Cons(h, _) =>
+          zeroPredSufConcreteUntilLemma(w.queue, h)
+        case _ =>
+          concreteZeroPredLemma(w.queue)
+      }))
     val nq = pushLeft(ys, w.queue)
     val nsched = nq match {
-        case Spine(Empty(), rear) =>
-           Cons(rear, w.schedule)
-        //case Spine(_, rear) =>
-        //  firstUnevaluated(rear) == head
-        case _ =>
+      case Spine(Empty(), createdWithSusp, rear) =>
+        if (createdWithSusp == True())
+          Cons[T](rear, w.schedule) // this is the only case where we create a new lazy closure
+        else
           w.schedule
-       }
-    (nq, nsched)
-  } /*ensuring (res => (res._2 match {
-    case Cons(head, tail) =>
-      res._1 match {
-        case Spine(t, rear) =>
-          (head*).isSpine &&
-          firstUnevaluated(rear) == head && {
-            val _ = head.value // evaluate the head and on the evaluated state the following should hold
-              //if (!t.isEmpty)
-                schedulesProperty(head, tail)
-              //else true
+      case _ =>
+        w.schedule
+    }
+    val lq: $[ConQ[T]] = nq
+    (lq, nsched)
+  } ensuring { res =>
+    schedulesProperty(res._1, res._2) &&
+      // lemma instantiations
+      (w.schedule match {
+        case Cons(head, tail) =>
+          w.queue* match {
+            case Spine(h, _, _) =>
+              if (h != Empty[T]())
+                pushLeftLazyLemma(ys, w.queue, head)
+              else true
+            case _ => true
           }
-      }
-    case _ =>
-      res._1 match {
-        case Spine(t, rear) =>
-          isConcrete(rear)
         case _ => true
+      }) &&
+      time <= 80
+  }
+
+  def Pay[T](q: $[ConQ[T]], scheds: Scheds[T]): Scheds[T] = {
+    require(schedulesProperty(q, scheds) && q.isEvaluated)
+    scheds match {
+      case c @ Cons(head, rest) =>
+        head.value match {
+          case Spine(Empty(), createdWithSusp, rear) =>
+            if (createdWithSusp == True())
+              Cons(rear, rest)
+            else
+              rest
+        }
+      case Nil() => scheds
+    }
+  } ensuring { res =>
+    {
+      val in = inState[ConQ[T]]
+      val out = outState[ConQ[T]]
+      schedulesProperty(q, res) &&
+        // instantiations for proving the scheds property
+        (scheds match {
+          case Cons(head, rest) =>
+            concUntilExtenLemma(q, head, in, out) &&
+              (head* match {
+                case Spine(Empty(), _, rear) =>
+                  res match {
+                    case Cons(rhead, rtail) =>
+                      schedMonotone(in, out, rtail, pushUntilCarry(rhead)) &&
+                        concUntilMonotone(rear, rhead, in, out) &&
+                        concUntilCompose(q, rear, rhead)
+                    case _ =>
+                      concreteMonotone(in, out, rear) &&
+                        concUntilConcreteExten(q, rear)
+                  }
+              })
+          case _ => true
+        }) &&
+        // instantiations for zeroPreceedsSuf property
+        (scheds match {
+          case Cons(head, rest) =>
+            concreteUntilIsSuffix(q, head) &&
+              (res match {
+                case Cons(rhead, rtail) =>
+                  concreteUntilIsSuffix(pushUntilCarry(head), rhead) &&
+                    suffixZeroLemma(q, head, rhead) &&
+                    zeroPreceedsSuf(q, rhead)
+                case _ =>
+                  true
+              })
+          case _ =>
+            true
+        })
+    } &&
+      time <= 70
+  }
+
+  // monotonicity lemmas
+  def schedMonotone[T](st1: Set[$[ConQ[T]]], st2: Set[$[ConQ[T]]], scheds: Scheds[T], l: $[ConQ[T]]): Boolean = {
+    require(st1.subsetOf(st2) &&
+      (schedulesProperty(l, scheds) withState st1)) // here the input state is fixed as 'st1'
+    (schedulesProperty(l, scheds) withState st2) && //property
+      //induction scheme
+      (scheds match {
+        case Cons(head, tail) =>
+          head* match {
+            case Spine(_, _, rear) =>
+              concUntilMonotone(l, head, st1, st2) &&
+                schedMonotone(st1, st2, tail, pushUntilCarry(head))
+            case _ => true
+          }
+        case Nil() =>
+          concreteMonotone(st1, st2, l)
+      })
+  } holds
+
+  def concreteMonotone[T](st1: Set[$[ConQ[T]]], st2: Set[$[ConQ[T]]], l: $[ConQ[T]]): Boolean = {
+    require((isConcrete(l) withState st1) && st1.subsetOf(st2))
+    (isConcrete(l) withState st2) && {
+      // induction scheme
+      l* match {
+        case Spine(_, _, tail) =>
+          concreteMonotone[T](st1, st2, tail)
+        case _ =>
+          true
       }
-  }))*/
+    }
+  } holds
 
-  /*def pushLeftAndPay[T](ys: Single[T], w: Wrapper[T]): Wrapper[T] = {
-    require(w.valid)
-    val pl = pushLeft(ys, w.queue)
-    val nq = $(pl) // 'zeroPreceedsLazy' invariant could be temporarily broken
-    // update the schedule
-    // note that only when nq is a spine and the head of nq is empty new lazy closures will be created
-    val tschs = nq.value match {
-      case Spine(Empty(), rear) =>
-        Cons(rear, w.schedule)
-      case _ =>
-        w.schedule
+  def concUntilMonotone[T](q: $[ConQ[T]], suf: $[ConQ[T]], st1: Set[$[ConQ[T]]], st2: Set[$[ConQ[T]]]): Boolean = {
+    require((concreteUntil(q, suf) withState st1) && st1.subsetOf(st2))
+    (concreteUntil(q, suf) withState st2) &&
+      (if (q != suf) {
+        q* match {
+          case Spine(_, _, tail) =>
+            concUntilMonotone(tail, suf, st1, st2)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  // suffix predicates and their properties (this should be generalizable)
+  def suffix[T](q: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    if (q == suf) true
+    else {
+      q* match {
+        case Spine(_, _, rear) =>
+          suffix(rear, suf)
+        case Tip(_) => false
+      }
+    }
+  }
+
+  // TODO: this should be inferable from the model
+  @axiom
+  def properSuffix[T](l: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    l* match {
+      case Spine(_, _, rear) =>
+        suffix(rear, suf)
+      case _ => false
     }
-    Wrapper(nq, pay(tschs, nq))
-  }*/ //ensuring (res => res.valid) // && res._2 <= 6)
-
-  /*def pay[T](schs: Scheds[T], xs: $[ConQ[T]]): Scheds[T] = {
-    require(schs match {
-      case Cons(h, t) =>
-        xs.value match {
-          case Spine(Empty(), rear) =>
-            firstUnevaluated(xs) == h
+  } ensuring (res => !res || suf != l)
+
+  def suffixCompose[T](q: $[ConQ[T]], suf1: $[ConQ[T]], suf2: $[ConQ[T]]): Boolean = {
+    require(suffix(q, suf1) && properSuffix(suf1, suf2))
+    properSuffix(q, suf2) &&
+      // induction over suffix(q, suf1)
+      (if (q == suf1) true
+      else {
+        q* match {
+          case Spine(_, _, rear) =>
+            suffixCompose(rear, suf1, suf2)
+          case Tip(_) => false
         }
+      })
+  } holds
+
+  // uncomment this once the model is fixed
+  /*def suffixInEquality[T](l: LazyConQ[T], suf: LazyConQ[T], suf2: ) : Boolean = {
+    require(properSuffix(l, suf))
+    (l != suf) && (
+      evalLazyConQS(l) match {
+        case Spine(_, _, rear) =>
+          suffixInEquality(rear, suf)
+        case _ => false
+      })
+  }.holds*/
+
+  // properties of 'concUntil'
+
+  def concreteUntilIsSuffix[T](l: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    require(concreteUntil(l, suf))
+    suffix(l, suf) &&
+      // induction scheme
+      (if (l != suf) {
+        (l* match {
+          case Spine(_, cws, tail) =>
+            concreteUntilIsSuffix(tail, suf)
+          case _ =>
+            true
+        })
+      } else true)
+  }.holds
+
+  // properties that extend `concUntil` to larger portions of the queue
+
+  def concUntilExtenLemma[T](q: $[ConQ[T]], suf: $[ConQ[T]], st1: Set[$[ConQ[T]]], st2: Set[$[ConQ[T]]]): Boolean = {
+    require((concreteUntil(q, suf) withState st1) && st2 == st1 ++ Set(suf))
+    (suf* match {
+      case Spine(_, _, rear) =>
+        concreteUntil(q, rear) withState st2
       case _ => true
-    })//weakSchedulesProperty(xs, schs)) // we are not recursive here
-    schs match {
-      case c @ Cons(head, rest) =>
-        head.value match {
-          case Spine(Empty(), rear) =>
-            Cons(rear, rest) // here, new lazy closures are created
+    }) &&
+      // induction scheme
+      (if (q != suf) {
+        q* match {
+          case Spine(_, _, tail) =>
+            concUntilExtenLemma(tail, suf, st1, st2)
           case _ =>
-            rest
+            true
         }
-      case Nil() =>
-        // here every thing is concretized
-        schs
+      } else true)
+  } holds
+
+  def concUntilConcreteExten[T](q: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf) && isConcrete(suf))
+    isConcrete(q) &&
+      (if (q != suf) {
+        q* match {
+          case Spine(_, _, tail) =>
+            concUntilConcreteExten(tail, suf)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  def concUntilCompose[T](q: $[ConQ[T]], suf1: $[ConQ[T]], suf2: $[ConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf1) && concreteUntil(suf1, suf2))
+    concreteUntil(q, suf2) &&
+      (if (q != suf1) {
+        q* match {
+          case Spine(_, _, tail) =>
+            concUntilCompose(tail, suf1, suf2)
+          case _ =>
+            true
+        }
+      } else true)
+  } holds
+
+  // properties that relate `concUntil`, `concrete`, and `zeroPreceedsSuf` to `zeroPreceedsLazy`
+  //   - these are used in preconditions to derive the `zeroPreceedsLazy` property
+
+  def zeroPredSufConcreteUntilLemma[T](q: $[ConQ[T]], suf: $[ConQ[T]]): Boolean = {
+    require(concreteUntil(q, suf) && zeroPreceedsSuf(q, suf))
+    zeroPreceedsLazy(q) && {
+      // induction scheme
+      if (q != suf) {
+        q* match {
+          case Spine(Empty(), _, _) => true
+          case Spine(_, _, tail) =>
+            zeroPredSufConcreteUntilLemma(tail, suf)
+          case _ =>
+            true
+        }
+      } else true
+    }
+  } holds
+
+  def concreteZeroPredLemma[T](q: $[ConQ[T]]): Boolean = {
+    require(isConcrete(q))
+    zeroPreceedsLazy(q) && {
+      // induction scheme
+      q* match {
+        case Spine(_, _, tail) =>
+          concreteZeroPredLemma(tail)
+        case _ =>
+          true
+      }
+    }
+  } holds
+
+  // properties relating `suffix` and `zeroPreceedsSuf`
+
+  def suffixZeroLemma[T](q: $[ConQ[T]], suf: $[ConQ[T]], suf2: $[ConQ[T]]): Boolean = {
+    require(suf* match {
+      case Spine(Empty(), _, _) =>
+        suffix(q, suf) && properSuffix(suf, suf2)
+      case _ => false
+    })
+    suffixCompose(q, suf, suf2) &&
+      zeroPreceedsSuf(q, suf2) && (
+        // induction scheme
+        if (q != suf) {
+          q* match {
+            case Spine(_, _, tail) =>
+              suffixZeroLemma(tail, suf, suf2)
+            case _ =>
+              true
+          }
+        } else true)
+  }.holds
+
+  /**
+   * Pushing an element to the left of the queue preserves the data-structure invariants
+   */
+  def pushLeftAndPay[T](ys: Single[T], w: Queue[T]) = {
+    require(w.valid)
+
+    val (q, scheds) = pushLeftWrapper(ys, w)
+    val nscheds = Pay(q, scheds)
+    Queue(q, nscheds)
+
+  } ensuring { res => res.valid &&
+    time <= 200
     }
-  }*/ /*ensuring (res => schedulesProperty(res._2, res._1) &&
-    res._3 <= 3)*/
 }
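
Note (a rough analogy, not the verified code): the Spine heads act like the digits of a number
in which Empty() is the zero digit; pushLeft is an increment, pushLeftLazy suspends the carry
propagation, and zeroPreceedsLazy says a zero digit occurs before the first suspension, which
is why forcing one suspension per operation (Pay) suffices. A digit-level, eager sketch of the
carry behaviour only:

  object CarrySketch {
    // digits least-significant first; increment with carry propagation
    def inc(ds: List[Int]): List[Int] = ds match {
      case Nil    => List(1)
      case 0 :: t => 1 :: t        // a zero stops the carry: the cheap case
      case _ :: t => 0 :: inc(t)   // a non-zero digit propagates the carry (suspended lazily above)
    }

    def main(args: Array[String]): Unit =
      // counts 0..5 in binary, least-significant digit first
      println((1 to 5).scanLeft(List[Int]())((ds, _) => inc(ds)).toList)
  }
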
diff --git a/testcases/lazy-datastructures/RealTimeQueue.scala b/testcases/lazy-datastructures/RealTimeQueue.scala
index 64fcd5aa15b996a9b7e3cc0adfd31e6286aafc87..9e4d602f0100dcbcf4151d9b79a855bb7c601c67 100644
--- a/testcases/lazy-datastructures/RealTimeQueue.scala
+++ b/testcases/lazy-datastructures/RealTimeQueue.scala
@@ -8,7 +8,10 @@ import leon.instrumentation._
 //TODO: need to automatically check monotonicity of isConcrete
 object RealTimeQueue {
 
-  sealed abstract class LList[T] {
+  /**
+   * A stream of values of type T
+   */
+  sealed abstract class Stream[T] {
     def isEmpty: Boolean = {
       this match {
         case SNil() => true
@@ -30,35 +33,35 @@ object RealTimeQueue {
       }
     } ensuring (_ >= 0)
   }
-  case class SCons[T](x: T, tail: $[LList[T]]) extends LList[T]
-  case class SNil[T]() extends LList[T]
+  case class SCons[T](x: T, tail: $[Stream[T]]) extends Stream[T]
+  case class SNil[T]() extends Stream[T]
 
-  def ssize[T](l: $[LList[T]]): BigInt = (l*).size
+  def ssize[T](l: $[Stream[T]]): BigInt = (l*).size
 
   //@monotonic
-  def isConcrete[T](l: $[LList[T]]): Boolean = {
-    (l.isEvaluated && (l* match {
+  def isConcrete[T](l: $[Stream[T]]): Boolean = {
+    l.isEvaluated && (l* match {
       case SCons(_, tail) =>
         isConcrete(tail)
       case _ => true
-    })) || (l*).isEmpty
+    })
   }
 
    @invstate
-  def rotate[T](f: $[LList[T]], r: List[T], a: $[LList[T]]): LList[T] = {
-    require(r.size == ssize(f) + 1 && isConcrete(f)) // size invariant between 'f' and 'r' holds
+  def rotate[T](f: $[Stream[T]], r: List[T], a: $[Stream[T]]): Stream[T] = { // doesn't change state
+    require(r.size == ssize(f) + 1 && isConcrete(f))
     (f.value, r) match {
       case (SNil(), Cons(y, _)) => //in this case 'y' is the only element in 'r'
         SCons[T](y, a)
       case (SCons(x, tail), Cons(y, r1)) =>
-        val newa: LList[T] = SCons[T](y, a)
+        val newa: Stream[T] = SCons[T](y, a)
         val rot = $(rotate(tail, r1, newa)) //this creates a lazy rotate operation
         SCons[T](x, rot)
     }
-  } ensuring (res => res.size == (f*).size + r.size + (a*).size && res.isCons && // using f*.size instead of ssize seems to help magically
+  } ensuring (res => res.size == (f*).size + r.size + (a*).size && res.isCons && // using f*.size instead of ssize seems to speed up verification magically
                      time <= 30)
 
-  def firstUnevaluated[T](l: $[LList[T]]): $[LList[T]] = {
+  def firstUnevaluated[T](l: $[Stream[T]]): $[Stream[T]] = {
     if (l.isEvaluated) {
       l* match {
         case SCons(_, tail) =>
@@ -75,34 +78,25 @@ object RealTimeQueue {
       case _ => true
     }))
 
-  def streamScheduleProperty[T](s: $[LList[T]], sch: $[LList[T]]) = {
-    firstUnevaluated(s) == firstUnevaluated(sch)
-  }
-
-  case class Queue[T](f: $[LList[T]], r: List[T], s: $[LList[T]]) {
+  case class Queue[T](f: $[Stream[T]], r: List[T], s: $[Stream[T]]) {
     def isEmpty = (f*).isEmpty
     def valid = {
-      streamScheduleProperty(f, s) &&
+      (firstUnevaluated(f) == firstUnevaluated(s)) &&
         ssize(s) == ssize(f) - r.size //invariant: |s| = |f| - |r|
     }
   }
 
-  def createQueue[T](f: $[LList[T]], r: List[T], sch: $[LList[T]]): Queue[T] = {
-    require(streamScheduleProperty(f, sch) &&
-      ssize(sch) == ssize(f) - r.size + 1) //size invariant holds
-    sch.value match { // evaluate the schedule if it is not evaluated
+  def enqueue[T](x: T, q: Queue[T]): Queue[T] = {
+    require(q.valid)
+    val r = Cons[T](x, q.r)
+    q.s.value match {
       case SCons(_, tail) =>
-        Queue(f, r, tail)
+        Queue(q.f, r, tail)
       case SNil() =>
-        val newa: LList[T] = SNil[T]()
-        val rotres = $(rotate(f, r, newa))
+        val newa: Stream[T] = SNil[T]()
+        val rotres = $(rotate(q.f, r, newa))
         Queue(rotres, Nil[T](), rotres)
     }
-  } ensuring (res => res.valid && time <= 50)
-
-  def enqueue[T](x: T, q: Queue[T]): Queue[T] = {
-    require(q.valid)
-    createQueue(q.f, Cons[T](x, q.r), q.s)
   } ensuring (res => res.valid && time <= 60)
 
   def dequeue[T](q: Queue[T]): Queue[T] = {
@@ -110,10 +104,10 @@ object RealTimeQueue {
     q.f.value match {
       case SCons(x, nf) =>
         q.s.value match {
-          case SCons(_, st) => //here, the precondition of createQueue (reg. suffix property) may get violated, so it is handled specially here.
+          case SCons(_, st) =>
             Queue(nf, q.r, st)
           case _ =>
-            val newa: LList[T] = SNil[T]()
+            val newa: Stream[T] = SNil[T]()
             val rotres = $(rotate(nf, q.r, newa))
             Queue(rotres, Nil[T](), rotres)
         }
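
Note (a sketch, not the verified code): ignoring laziness, rotate(f, r, a) computes
f ++ r.reverse ++ a while transferring exactly one element of r per step, which is what lets
the reversal be paid for incrementally. A plain-list version under the same size precondition
|r| = |f| + 1:

  object RotateSketch {
    def rotate[T](f: List[T], r: List[T], a: List[T]): List[T] = {
      require(r.size == f.size + 1)
      (f, r) match {
        case (Nil, y :: _)      => y :: a                       // y is the only remaining element of r
        case (x :: ft, y :: rt) => x :: rotate(ft, rt, y :: a)  // move one element of r onto the accumulator
        case _                  => sys.error("unreachable given the size precondition")
      }
    }

    def main(args: Array[String]): Unit =
      println(rotate(List(1, 2), List(5, 4, 3), Nil)) // List(1, 2, 3, 4, 5)
  }
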
diff --git a/testcases/verification/compilation/TinyCertifiedCompiler-Parametric.scala b/testcases/verification/compilation/TinyCertifiedCompiler-Parametric.scala
new file mode 100644
index 0000000000000000000000000000000000000000..7518c21fb8ba7349e49c37c8acac6f99aaae2cdd
--- /dev/null
+++ b/testcases/verification/compilation/TinyCertifiedCompiler-Parametric.scala
@@ -0,0 +1,89 @@
+import leon.lang._
+import leon.lang.Option
+import leon.collection._
+import leon.annotation._
+import leon.proof._
+import leon.lang.synthesis._
+
+object TinyCertifiedCompiler {
+  abstract class ByteCode[A]
+  case class Load[A](c: A) extends ByteCode[A] // loads a constant into the stack
+  case class OpInst[A]() extends ByteCode[A]
+
+  abstract class ExprTree[A]
+  case class Const[A](c: A) extends ExprTree[A]
+  case class Op[A](e1: ExprTree[A], e2: ExprTree[A]) extends ExprTree[A]
+
+  def compile[A](e: ExprTree[A]): List[ByteCode[A]] = {
+    e match {
+      case Const(c) =>
+        Cons(Load(c), Nil[ByteCode[A]]())
+      case Op(e1, e2) =>
+        (compile(e1) ++ compile(e2)) ++ Cons(OpInst(), Nil[ByteCode[A]]())
+    }
+  }
+
+  def op[A](x: A, y: A): A = {
+    ???[A]
+  }
+
+  def run[A](bytecode: List[ByteCode[A]], S: List[A]): List[A] = {
+    (bytecode, S) match {
+      case (Cons(Load(c), tail), _) =>
+        run(tail, Cons[A](c, S)) // adding elements to the head of the stack
+      case (Cons(OpInst(), tail), Cons(x, Cons(y, rest))) =>
+        run(tail, Cons[A](op(y, x), rest))
+      case (Cons(_, tail), _) =>
+        run(tail, S)
+      case (Nil(), _) => // no bytecode to execute
+        S
+    }
+  }
+
+  def interpret[A](e: ExprTree[A]): A = {
+    e match {
+      case Const(c) => c
+      case Op(e1, e2) =>
+        op(interpret(e1), interpret(e2))
+    }
+  }
+
+  def runAppendLemma[A](l1: List[ByteCode[A]], l2: List[ByteCode[A]], S: List[A]): Boolean = {
+    // lemma
+    (run(l1 ++ l2, S) == run(l2, run(l1, S))) because
+      // induction scheme (induct over l1)
+      (l1 match {
+        case Nil() =>
+          true
+        case Cons(h, tail) =>
+          (h, S) match {
+            case (Load(c), _) =>
+              runAppendLemma(tail, l2, Cons[A](c, S))
+            case (OpInst(), Cons(x, Cons(y, rest))) =>
+              runAppendLemma(tail, l2, Cons[A](op(y, x), rest))
+            case (_, _) =>
+              runAppendLemma(tail, l2, S)
+            case _ =>
+              true
+          }
+      })
+  }.holds
+
+  def compileInterpretEquivalenceLemma[A](e: ExprTree[A], S: List[A]): Boolean = {
+    // lemma
+    (run(compile(e), S) == interpret(e) :: S) because 
+      // induction scheme
+      (e match {
+        case Const(c) =>
+          true
+        case Op(e1, e2) =>
+          // lemma instantiation
+          val c1 = compile(e1)
+          val c2 = compile(e2)
+          runAppendLemma((c1 ++ c2), Cons[ByteCode[A]](OpInst[A](), Nil[ByteCode[A]]()), S) &&
+            runAppendLemma(c1, c2, S) &&
+            compileInterpretEquivalenceLemma(e1, S) &&
+            compileInterpretEquivalenceLemma(e2, Cons[A](interpret(e1), S))
+      })
+  } holds
+}
diff --git a/testcases/verification/compilation/TinyCertifiedCompiler-with-errorOption.scala b/testcases/verification/compilation/TinyCertifiedCompiler-with-errorOption.scala
new file mode 100644
index 0000000000000000000000000000000000000000..912558e6a409a86fa9fe27761c4d265ef586f380
--- /dev/null
+++ b/testcases/verification/compilation/TinyCertifiedCompiler-with-errorOption.scala
@@ -0,0 +1,95 @@
+import leon.lang._
+import leon.lang.Option
+import leon.collection._
+import leon.annotation._
+
+object TinyCertifiedCompiler {
+
+  abstract class ByteCode
+  case class Load(c: BigInt) extends ByteCode // loads a constant into the stack
+  case class OpInst() extends ByteCode
+
+  abstract class ExprTree
+  case class Const(c: BigInt) extends ExprTree
+  case class OP(e1: ExprTree, e2: ExprTree) extends ExprTree
+
+  def compile(e: ExprTree): List[ByteCode] = {
+    e match {
+      case Const(c) =>
+        Cons(Load(c), Nil[ByteCode]())
+      case OP(e1, e2) =>
+        (compile(e1) ++ compile(e2)) ++ Cons(OpInst(), Nil[ByteCode]())
+    }
+  }
+
+  def op(x: BigInt, y: BigInt) : BigInt = {
+    x + y    
+  }
+
+  def run(bytecode: List[ByteCode], S: Option[List[BigInt]]) : Option[List[BigInt]] = {
+    (bytecode, S) match {
+      case (Cons(Load(c), tail), Some(stack)) =>
+        run(tail, Some(Cons(c, stack))) // adding elements to the head of the stack
+      case (Cons(OpInst(), tail), Some(Cons(x, Cons(y, rest)))) =>
+        run(tail, Some(Cons(op(x, y), rest)))
+      case (Nil(), _) => // no bytecode to execute
+        S
+      case _ =>
+        // here, we have encountered a runtime error, so we return None to signal error
+        None[List[BigInt]]
+    }
+  }
+
+  def interpret(e: ExprTree) : BigInt = {
+    e match {
+      case Const(c) => c
+      case OP(e1, e2) =>
+        op(interpret(e1), interpret(e2))
+    }
+  }
+
+  def runAppendLemma(l1: List[ByteCode], l2: List[ByteCode], S: Option[List[BigInt]]) : Boolean = {
+    // lemma
+    run(l1 ++ l2, S) == run(l2, run(l1, S)) &&
+    // induction scheme (induct over l1)
+    (l1 match {
+      case Nil() =>
+        true
+      case Cons(h, tail) =>
+        (h, S) match {
+          case (Load(c), Some(stk)) =>
+            runAppendLemma(tail, l2, Some(Cons(c, stk)))
+          case (OpInst(), Some(Cons(x, Cons(y, rest)))) =>
+            runAppendLemma(tail, l2, Some(Cons(op(x, y), rest)))
+          case _ =>
+            true
+        }
+    })
+  } holds
+
+  def compileInterpretEquivalenceLemma(e: ExprTree, S: Option[List[BigInt]]) : Boolean = {
+    S match {
+      case None() => true
+      case Some(stk) =>
+        // lemma
+        (run(compile(e), S) match {
+          case None() => true // here, there was a run-time error
+          case Some(rstk) =>
+            rstk == Cons[BigInt](interpret(e), stk)
+        }) &&
+          // induction scheme
+          (e match {
+            case Const(c) =>
+              true
+            case OP(e1, e2) =>
+              // lemma instantiation
+              val c1 = compile(e1)
+              val c2 = compile(e2)
+              runAppendLemma((c1 ++ c2), Cons(OpInst(), Nil[ByteCode]()), S) &&
+                runAppendLemma(c1, c2, S) &&
+              compileInterpretEquivalenceLemma(e1, S) &&
+                compileInterpretEquivalenceLemma(e2, Some(Cons[BigInt](interpret(e1), stk)))
+          })
+    }
+  } holds
+}
diff --git a/testcases/verification/compilation/TinyCertifiedCompiler.scala b/testcases/verification/compilation/TinyCertifiedCompiler.scala
new file mode 100644
index 0000000000000000000000000000000000000000..b7f46a49a27cf4092518afa08f0cef5ff8f724c7
--- /dev/null
+++ b/testcases/verification/compilation/TinyCertifiedCompiler.scala
@@ -0,0 +1,107 @@
+import leon.lang._
+import leon.lang.Option
+import leon.collection._
+import leon.annotation._
+import leon.proof._
+
+object TinyCertifiedCompiler {
+
+  abstract class ByteCode
+  case class Load(c: BigInt) extends ByteCode // loads a constant into the stack
+  case class OpInst() extends ByteCode
+
+  abstract class ExprTree
+  case class Const(c: BigInt) extends ExprTree
+  case class Op(e1: ExprTree, e2: ExprTree) extends ExprTree
+
+  def compile(e: ExprTree): List[ByteCode] = {
+    e match {
+      case Const(c) =>
+        Cons(Load(c), Nil[ByteCode]())
+      case Op(e1, e2) =>
+        (compile(e1) ++ compile(e2)) ++ Cons(OpInst(), Nil[ByteCode]())
+    }
+  }
+
+  def op(x: BigInt, y: BigInt): BigInt = {
+    x - y
+  }
+
+  def run(bytecode: List[ByteCode], S: List[BigInt]): List[BigInt] = {
+    (bytecode, S) match {
+      case (Cons(Load(c), tail), _) =>
+        run(tail, Cons[BigInt](c, S)) // adding elements to the head of the stack
+      case (Cons(OpInst(), tail), Cons(x, Cons(y, rest))) =>
+        run(tail, Cons[BigInt](op(y, x), rest))
+      case (Cons(_, tail), _) =>
+        run(tail, S)
+      case (Nil(), _) => // no bytecode to execute
+        S
+    }
+  }
+
+  def interpret(e: ExprTree): BigInt = {
+    e match {
+      case Const(c) => c
+      case Op(e1, e2) =>
+        op(interpret(e1), interpret(e2))
+    }
+  }
+
+  def runAppendLemma(l1: List[ByteCode], l2: List[ByteCode], S: List[BigInt]): Boolean = {
+    // lemma
+    (run(l1 ++ l2, S) == run(l2, run(l1, S))) because
+      // induction scheme (induct over l1)
+      (l1 match {
+        case Nil() =>
+          true
+        case Cons(h, tail) =>
+          (h, S) match {
+            case (Load(c), _) =>
+              runAppendLemma(tail, l2, Cons[BigInt](c, S))
+            case (OpInst(), Cons(x, Cons(y, rest))) =>
+              runAppendLemma(tail, l2, Cons[BigInt](op(y, x), rest))
+            case (_, _) =>
+              runAppendLemma(tail, l2, S)
+            case _ =>
+              true
+          }
+      })
+  }.holds
+  
+  def run1(bytecode: List[ByteCode], S: List[BigInt]): List[BigInt] = {
+    (bytecode, S) match {
+      case (Cons(Load(c), tail), _) =>
+        run1(tail, Cons[BigInt](c, S)) // adding elements to the head of the stack
+      case (Cons(OpInst(), tail), Cons(x, Cons(y, rest))) =>
+        run1(tail, Cons[BigInt](op(x, y), rest))
+      case (Cons(_, tail), _) =>
+        run1(tail, S)
+      case (Nil(), _) => // no bytecode to execute
+        S
+    }
+  }
+  
+  def compileInterpretEquivalenceLemma1(e: ExprTree, S: List[BigInt]): Boolean = {
+    // lemma
+    (run1(compile(e), S) == interpret(e) :: S) 
+  } holds
+
+  def compileInterpretEquivalenceLemma(e: ExprTree, S: List[BigInt]): Boolean = {
+    // lemma
+    (run(compile(e), S) == interpret(e) :: S) because 
+      // induction scheme
+      (e match {
+        case Const(c) =>
+          true
+        case Op(e1, e2) =>
+          // lemma instantiation
+          val c1 = compile(e1)
+          val c2 = compile(e2)
+          runAppendLemma((c1 ++ c2), Cons(OpInst(), Nil[ByteCode]()), S) &&
+            runAppendLemma(c1, c2, S) &&
+            compileInterpretEquivalenceLemma(e1, S) &&
+            compileInterpretEquivalenceLemma(e2, Cons[BigInt](interpret(e1), S))
+      })
+  } holds
+}
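
Note: a standalone, runnable sketch mirroring TinyCertifiedCompiler.scala (with op(x, y) = x - y
as in that file), showing the property proved by compileInterpretEquivalenceLemma on a concrete
expression, namely run(compile(e), S) == interpret(e) :: S.

  object TinyCompilerDemo {
    sealed abstract class ByteCode
    case class Load(c: BigInt) extends ByteCode
    case class OpInst() extends ByteCode

    sealed abstract class ExprTree
    case class Const(c: BigInt) extends ExprTree
    case class Op(e1: ExprTree, e2: ExprTree) extends ExprTree

    def op(x: BigInt, y: BigInt): BigInt = x - y

    def compile(e: ExprTree): List[ByteCode] = e match {
      case Const(c)   => List(Load(c))
      case Op(e1, e2) => compile(e1) ++ compile(e2) ++ List(OpInst())
    }

    def interpret(e: ExprTree): BigInt = e match {
      case Const(c)   => c
      case Op(e1, e2) => op(interpret(e1), interpret(e2))
    }

    def run(bytecode: List[ByteCode], s: List[BigInt]): List[BigInt] = (bytecode, s) match {
      case (Load(c) :: tail, _)               => run(tail, c :: s)       // push the constant
      case (OpInst() :: tail, x :: y :: rest) => run(tail, op(y, x) :: rest) // apply op to the top two elements
      case (_ :: tail, _)                     => run(tail, s)
      case (Nil, _)                           => s
    }

    def main(args: Array[String]): Unit = {
      val e = Op(Const(5), Const(3))                       // 5 - 3
      println(compile(e))                                  // List(Load(5), Load(3), OpInst())
      println(run(compile(e), Nil) == interpret(e) :: Nil) // true
    }
  }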