You should make `foo2()` generic in `T`, the tuple type of the first ("a") elements of the inputs. So if you call

foo2([
  ['a', `do something here with a`],
  ['b', `do something here with b`],
  ['c', `do something here with c`],
]);

then `T` should be the tuple type `["a", "b", "c"]`.
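The question's `X<T>` type isn't reproduced here; judging from the examples it behaves like a template literal type, so the sketches below assume a definition along these lines (a hypothetical stand-in; substitute your actual one):

type X<T extends string> = `do something here with ${T}`; // assumed definition

With that in mind, here's the first attempt at a solution: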
declare function foo2<T extends readonly string[]>(
  tuples: readonly [...{ [I in keyof T]:
    [a: T[I], b: X<T[I]>]
  }]): void;
The `tuples` input is essentially of the mapped tuple type `{ [I in keyof T]: [a: T[I], b: X<T[I]>] }`, meaning that for each element of `T` at index `I`, the value `T[I]` will be transformed into `[a: T[I], b: X<T[I]>]`. I've wrapped that type in a variadic tuple type `readonly [...⋯]` in order to give the compiler a hint that we want `T` to be inferred as a tuple and not an unordered array type. So if you pass in `tuples` as `[["x", X<"x">], ["y", X<"y">]]`, then `T` should be inferred as `["x", "y"]`.
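To make that concrete, here's what the mapped type produces for a specific `T`, pulled out into a helper alias (using the assumed `X` from above):

type MapPairs<T extends readonly string[]> =
  { [I in keyof T]: [a: T[I], b: X<T[I]>] };
type Example = MapPairs<["x", "y"]>;
// => [[a: "x", b: X<"x">], [a: "y", b: X<"y">]]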
This mostly works:
foo2([
  ['a', `do something here with a`], // okay
  ['b', `do something here with b`], // okay
  ['c', `do something here with c`], // okay
  // ...
  ['x', "do something here with w"], // no error?!
  ['y', "do nothing"], // error
  ['z', "do something here with z"] // okay
])
// foo2<["a", "b", "c", "x" | "w", "y", "z"]>(⋯)
Except that for `['x', "do something here with w"]`, the union type `"x" | "w"` was inferred for the corresponding element of `T`. This is reasonable behavior, but not what you want. Indeed, you want `T[I]` to be inferred only from the first ("a") elements of the inputs, not from the second ("b") elements. That means you want the second elements to use `T[I]` only for checking, not for inferring.
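The same union-widening shows up with a much simpler (hypothetical) signature, where one type parameter is inferred from two positions:

declare function pair<T extends string>(a: T, b: T): void;
pair("x", "w"); // no error; T is inferred as "x" | "w"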
There's a longstanding open issue at microsoft/TypeScript#14829 to support non-inferential type parameter usages. The idea is that there'd be a `NoInfer<T>` utility type so that one could write
declare function foo2<T extends readonly string[]>(
  tuples: readonly [...{ [I in keyof T]:
    [a: T[I], b: X<NoInfer<T[I]>>]
  }]): void;
and the compiler would understand that it should not use that `b` element of the tuple to infer `T[I]`. There is currently (as of TypeScript 5.0) no direct support for this, but there are various techniques available which work. One is mentioned here, where you define `NoInfer<T>` as a conditional type in order to defer its evaluation:
type NoInfer<T> = [T][T extends unknown ? 0 : never]
With that definition of `NoInfer<T>`, you get the behavior you want:
foo2([
  ['a', `do something here with a`], // okay
  ['b', `do something here with b`], // okay
  ['c', `do something here with c`], // okay
  // ...
  ['x', "do something here with w"], // error!
  ['y', "do nothing"], // error!
  ['z', "do something here with z"] // okay
])
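For intuition, here's the same fix applied to the hypothetical `pair()` signature from before:

declare function pair2<T extends string>(a: T, b: NoInfer<T>): void;
pair2("x", "x"); // okay
pair2("x", "w"); // error! T is inferred as "x" from a alone, so "w" is rejected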
Another approach is described here, where you add a second type parameter constrained to the first one, because generic constraints don't act as inference sites (see microsoft/TypeScript#7234):
declare function foo2<T extends readonly string[], U extends T>(
  tuples: readonly [...{ [I in keyof T]:
    [a: T[I], b: X<U[I]>]
  }]): void;
And this also works:
foo2([
  ['a', `do something here with a`], // okay
  ['b', `do something here with b`], // okay
  ['c', `do something here with c`], // okay
  // ...
  ['x', "do something here with w"], // error!
  ['y', "do nothing"], // error!
  ['z', "do something here with z"] // okay
])
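In miniature, the same trick with the hypothetical `pair()` shape:

declare function pair3<T extends string, U extends T>(a: T, b: U): void;
pair3("x", "x"); // okay
pair3("x", "w"); // error! U is constrained to "x", so "w" is rejected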
It's possible that someday (soon?) there will be an official `NoInfer<T>` type, and then you can just use it. Until then, you can use one of these alternatives.
Playground link to code