import scala.language.higherKinds

object Identity {
  type Id[+A] = A
}

import Identity._

case class ConfigItem[F[+_], +A](name: String, value: F[A], expected: Option[A])

abstract class Check[ItemF[+_], Value] {
  type CItem = ConfigItem[ItemF, Value]
  def check: List[CItem]
}

final case class TestCheck() extends Check[Id, Int] {
  def check: List[CItem] = List(ConfigItem[Id, Int]("Test", 1, None))
}

object Main extends App {
  def filter[F[+_], A](in: List[ConfigItem[F, A]]) =
    in.filter(!_.expected.isDefined)

  /* Assigning the Check instance to an intermediate variable allows the
   * type inferencer to do its job. These two rows compile fine. */
  val testInstance = TestCheck()
  val filteredWorks = filter(testInstance.check)

  /* If on the other hand the instance is created at the same time as we
   * call check() on it, the inferencer fails miserably. */
  val filtered = filter(TestCheck().check)

  println(filteredWorks)
}
[info] Compiling 1 Scala source to /tmp/rendererAhc2oLasji/target/classes...
[error] /tmp/rendererAhc2oLasji/src/main/scala/test.scala:39: no type parameters for method filter: (in: List[ConfigItem[F,A]])List[ConfigItem[F,A]] exist so that it can be applied to arguments (List[ConfigItem[[+A]A,Int]])
[error]  --- because ---
[error] argument expression's type is not compatible with formal parameter type;
[error]  found   : List[ConfigItem[[+A]A,Int]]
[error]  required: List[ConfigItem[?F,?A]]
[error]   val filtered = filter(TestCheck().check)
[error]                        ^
[error] /tmp/rendererAhc2oLasji/src/main/scala/test.scala:39: type mismatch;
[error]  found   : List[ConfigItem[[+A(in type Id)]A(in type Id),Int]]
[error]  required: List[ConfigItem[F,A(in method filter)]]
[error]   val filtered = filter(TestCheck().check)
[error]                                          ^
[error] two errors found
=====================================
SethTisue changed the title to "dealiasing within existentialExtrapolation violates principle of least surprise and degrades type inference" (Jul 18, 2023).
From the mailing list: https://groups.google.com/forum/#!topic/scala-user/6YSTIJuXpKw
=====================================
The curious difference seems to boil down to a difference in inferred types for the call to the check method depending on whether or not the receiver has a stable type.
Warning: Implementation details follow!
This happens when the type of check is computed. Because that type contains types prefixed with the this-type of TestCheck (e.g. TestCheck.this.HKId), the asSeenFrom operation internally uses an existential type in the second case. It is a bit easier to think about this as though there were a synthetic block of code like:
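A plausible reconstruction of that synthetic block (the exact tree is compiler-internal; the local name _1 matches the symbol mentioned in the next sentence):

    // Conceptually, TestCheck().check is typed as if the receiver were
    // bound to a local value first:
    val filtered = filter({
      val _1: TestCheck = TestCheck()
      _1.check  // its type, List[_1.CItem], mentions the local symbol _1
    })

Because the block's result type refers to _1, which is not visible outside the block, the compiler must abstract over it existentially, and that is where the alias gets expanded.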
When the type of this block is computed, we have to hide the local symbol _1. This is done in existentialAbstraction. Since aaf919859f / Scala 2.8, this internally expands type aliases. I don't know the motivation of that change. But the overall result seems surprising enough to me to be considered a bug.
In your example, a workaround is to explicitly annotate the result of calling check with a type alias that does not include the TestCheck.this type.
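A sketch of that workaround, assuming the definitions from the example above: ascribe the result of check with the alias spelled out so that no TestCheck.this prefix appears in the type the inferencer sees.

    // The explicit ascription names ConfigItem[Id, Int] directly, with Id
    // referring to Identity.Id rather than an alias under TestCheck.this,
    // so filter's F and A can be inferred as Id and Int.
    val filtered = filter(TestCheck().check: List[ConfigItem[Id, Int]])

The earlier intermediate-variable trick works for the same reason: a stable receiver gives check a type that needs no existential abstraction.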
=====================
I tried changing normalizeAliases to just dealias, rather than normalize. This has the effect of making some tests fail (e.g. test/files/pos/t7517.scala). So we might not have a way to fix this (if, indeed, it is considered a bug!). But I thought I'd lodge the ticket here for the record in any case.