
High-level explanation of API changes #453

Open
mmkal opened this issue Apr 20, 2020 · 103 comments
Labels
experimental: something related to the experimental features

Comments


mmkal commented Apr 20, 2020

📖 Documentation

Would it be possible to add a paragraph or two to the readme, on the differences between "experimental" (Decoder/Encoder/Codec/Schema) and "stable" (Type), and who should use which? Assuming the concept of Type will be removed in v3, does it mean usage like

import * as t from 'io-ts'

const Person = t.type({ name: t.string })

Person.decode({ name: 'Alice' })

will need to change? If so, will there be a migration guide (I assume Codec largely replaces Type)? I've seen the S.make(S => ...) syntax around in a few places, but it's not immediately clear whether applications relying on io-ts will need to use it, or if it's mainly a low-level construct.

A link to a tracking issue could also work.


gcanti commented Apr 20, 2020

Decoder, Encoder and Codec modules are published as experimental in order to get early feedback from the community.

Codec should largely replace Type except for untagged unions (that's because encoders don't support them).

The Schema module is an advanced feature which must prove its worth: the goal is to express a generic schema and then derive multiple concrete instances from it (decoders, encoders, equality instances, arbitraries, etc.).

I think that speaking about a v3 is too early, the new modules / APIs must be validated first.

I'm reserving the 2.2 label to track the issues related to the new modules.

As for the API changes, here's a tiny migration guide to help with experimenting with the Decoder module:

  • Decoder is defined with only one type parameter
  • keyof is replaced by literal
  • record doesn't accept a domain schema anymore
  • brand is replaced by refinement / parse, which are not opinionated about how you define your branded types
  • recursive is renamed to lazy (mutually recursive decoders are supported)
  • intersect is now pipeable
  • tagged unions must be explicitly defined (with sum) in order to get optimized
  • tuple is no longer limited to a maximum of five components
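Since the thread doesn't show the new combinators side by side, here is a minimal, self-contained sketch of the shape of the new API (hand-rolled stand-ins, not the real io-ts definitions) showing `literal` in the role of the old `keyof`:

```typescript
// Hand-rolled stand-ins for illustration only -- not the real io-ts definitions.
// The key point: the new Decoder has a single type parameter and always
// decodes from `unknown`.
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

interface Decoder<A> {
  decode: (u: unknown) => Either<string, A>
}

// `literal` plays the role of the old `keyof` combinator
const literal = <A extends readonly [string, ...Array<string>]>(...values: A): Decoder<A[number]> => ({
  decode: (u) =>
    values.includes(u as string)
      ? right(u as A[number])
      : left(`cannot decode ${JSON.stringify(u)}, should be ${values.join(' | ')}`)
})

// old: t.keyof({ a: null, b: null }) -- new style:
export const AB = literal('a', 'b')
```

`AB.decode('a')` succeeds, while `AB.decode('c')` fails with a message listing the expected literals.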

@gcanti added the experimental label Apr 20, 2020

mmkal commented Apr 20, 2020

Thanks - some follow-up questions:

Codec should largely replace Type except for untagged unions (that's because encoders don't support them).

Will untagged unions be supported at all?

  • Decoder is defined with only one type parameter

Does that mean there's no longer such a thing as a Decoder from I to A - effectively I is always unknown?

  • keyof is replaced by literal

Will this affect the advice to use t.keyof({ a: null, b: null }) to achieve enum-like behaviour (potentially related to the answer to "Will untagged unions be supported at all")?

  • brand is replaced by refinement / parse which are not opinionated on how you want to define your branded types

Does that mean anything for #373? The request there was to keep un-branded refinements as an option.


steida commented Apr 20, 2020

Will this be possible as a Codec? I mean the possibility to decode from two different input types.

export const FaunaDocRef = (() => {
  const Serialized = t.type({
    '@ref': t.type({
      id: FaunaID,
      collection: t.type({
        '@ref': t.type({
          id: t.string,
          collection: t.type({
            '@ref': t.type({ id: t.literal('collections') }),
          }),
        }),
      }),
    }),
  });
  const FaunaDocRef = t.type({
    id: FaunaID,
    collection: t.string,
  });
  type FaunaDocRef = t.TypeOf<typeof FaunaDocRef>;
  return new t.Type<FaunaDocRef, values.Ref, unknown>(
    'FaunaDocRef',
    FaunaDocRef.is,
    (u, c) => {
      if (u instanceof values.Ref) {
        return u.collection
          ? t.success({
              id: u.id as FaunaID, // as FaunaID is ok, we don't create ids anyway
              collection: u.collection.id,
            })
          : t.failure(u, c);
      }
      return either.either.chain(Serialized.validate(u, c), (s) =>
        t.success({
          id: s['@ref'].id,
          collection: s['@ref'].collection['@ref'].id,
        }),
      );
    },
    (a) =>
      new values.Ref(
        a.id,
        new values.Ref(a.collection, values.Native.COLLECTIONS),
      ),
  );
})();
export type FaunaDocRef = t.TypeOf<typeof FaunaDocRef>;


gcanti commented Apr 20, 2020

Will untagged unions be supported at all?

They are supported in Decoder. I can't find a reliable way to support them in Encoder though.

So either you define an encoder by hand...

import { left, right } from 'fp-ts/lib/Either'
import * as C from 'io-ts/lib/Codec'
import * as D from 'io-ts/lib/Decoder'
import * as G from 'io-ts/lib/Guard'

const NumberFromString: C.Codec<number> = C.make(
  D.parse(D.string, (s) => {
    const n = parseFloat(s)
    return isNaN(n) ? left(`cannot decode ${JSON.stringify(s)}, should be NumberFromString`) : right(n)
  }),
  { encode: String }
)

export const MyUnion: C.Codec<number | string> = C.make(D.union(NumberFromString, D.string), {
  encode: (a) => (G.string.is(a) ? a : NumberFromString.encode(a))
})

...or you extend Codec to something containing a guard

import * as E from 'io-ts/lib/Encoder'

interface Compat<A> extends D.Decoder<A>, E.Encoder<A>, G.Guard<A> {}

function make<A>(codec: C.Codec<A>, guard: G.Guard<A>): Compat<A> {
  return {
    is: guard.is,
    decode: codec.decode,
    encode: codec.encode
  }
}

function union<A extends ReadonlyArray<unknown>>(
  ...members: { [K in keyof A]: Compat<A[K]> }
): Compat<A[number]> {
  return {
    is: G.guard.union(...members).is,
    decode: D.decoder.union(...members).decode,
    encode: (a) => {
      for (const member of members) {
        if (member.is(a)) {
          return member.encode(a)
        }
      }
    }
  }
}

const string = make(C.string, G.string)

const NumberFromString2 = make(NumberFromString, G.number)

export const MyUnion2: Compat<number | string> = union(NumberFromString2, string)

..or... something else? I don't know, any idea?

Does that mean there's no longer such a thing as a Decoder from I to A - effectively I is always unknown?

Yes, though note that I is effectively always unknown in the stable API too.

Will this affect the advice to use t.keyof({ a: null, b: null }) to achieve enum-like behaviour

Not sure what you mean, but the new way is

import * as D from 'io-ts/lib/Decoder'

const MyEnum = D.literal('a', 'b')

and literal is supported by Encoder, so you don't need untagged unions for that.

Does that mean anything for #373? The request there was to keep un-branded refinements as an option

The new refinement function has the following signature

export declare function refinement<A, B extends A>(
  from: Decoder<A>,
  refinement: (a: A) => a is B,
  expected: string
): Decoder<B>

where B is supposed to be different from A, while the old refinement function has the following signature

export declare function refinement<C extends Any>(
  codec: C,
  predicate: Predicate<TypeOf<C>>,
  name?: string
): RefinementC<C>

which I still consider a bad API since the predicate is not carried to the type level.
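To make the difference concrete, here is a runnable sketch (again with hand-rolled stand-ins, not the real io-ts code) of the new-style refinement, where the type guard carries the refinement to the type level:

```typescript
// Hand-rolled stand-ins for illustration -- not the real io-ts definitions.
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

interface Decoder<A> {
  decode: (u: unknown) => Either<string, A>
}

const string: Decoder<string> = {
  decode: (u) => (typeof u === 'string' ? right(u) : left('not a string'))
}

// new-style refinement: `B extends A` means the predicate is carried to the type level
const refinement = <A, B extends A>(
  from: Decoder<A>,
  refine: (a: A) => a is B,
  expected: string
): Decoder<B> => ({
  decode: (u) => {
    const e = from.decode(u)
    if (e._tag === 'Left') return e
    return refine(e.right) ? right(e.right) : left(`cannot decode, should be ${expected}`)
  }
})

// a branded type, refined from string
interface NonEmptyStringBrand { readonly NonEmptyString: unique symbol }
type NonEmptyString = string & NonEmptyStringBrand

export const NonEmptyString = refinement(
  string,
  (s: string): s is NonEmptyString => s.length > 0,
  'NonEmptyString'
)
```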


steida commented Apr 21, 2020

@gcanti Will a JSON type be possible?



IMax153 commented Apr 21, 2020

@steida Here is how I solved it using the suggestions from @gcanti above.

import * as C from 'io-ts/lib/Codec';
import * as D from 'io-ts/lib/Decoder';
import * as E from 'io-ts/lib/Encoder';
import * as G from 'io-ts/lib/Guard';

export interface Compat<A> extends D.Decoder<A>, E.Encoder<A>, G.Guard<A> {}

export const makeCompat: <A>(c: C.Codec<A>, g: G.Guard<A>) => Compat<A> = (c, g) => ({
  is: g.is,
  decode: c.decode,
  encode: c.encode,
});

export const lazy = <A>(id: string, f: () => Compat<A>): Compat<A> => {
  return makeCompat(C.lazy(id, f), G.guard.lazy(id, f));
};

export const untaggedUnion: <A extends ReadonlyArray<unknown>>(
  ...ms: { [K in keyof A]: Compat<A[K]> }
) => Compat<A[number]> = (...ms) => ({
  is: G.guard.union(...ms).is,
  decode: D.decoder.union(...ms).decode,
  encode: (a) => ms.find((m) => m.is(a))?.encode(a),
});

type Json = string | number | boolean | null | { [property: string]: Json } | Json[];

const Json: Compat<Json> = lazy<Json>('Json', () =>
  untaggedUnion(
    makeCompat(C.string, G.string),
    makeCompat(C.number, G.number),
    makeCompat(C.boolean, G.boolean),
    makeCompat(C.literal(null), G.literal(null)),
    makeCompat(C.record(Json), G.record(Json)),
    makeCompat(C.array(Json), G.array(Json)),
  ),
);

const json = Json.decode([1, [[['1']], { a: 1, b: false }]]);
console.log(json);
// {
//    _tag: 'Right',
//    right: [ 1, [ [ [ '1' ] ], { a: 1, b: false } ] ]
// }


gcanti commented Apr 21, 2020

@IMax153 @steida the Json encoder is just the identity function

import * as C from 'io-ts/lib/Codec'
import * as D from 'io-ts/lib/Decoder'
import * as E from 'io-ts/lib/Encoder'

type Json = string | number | boolean | null | { [key: string]: Json } | Array<Json>

const JsonDecoder = D.lazy<Json>('Json', () =>
  D.union(C.string, C.number, C.boolean, C.literal(null), C.record(Json), C.array(Json))
)

const Json: C.Codec<Json> = C.make(JsonDecoder, E.id)


mmkal commented Apr 29, 2020

..or... something else? I don't know, any idea?

@gcanti maybe there could be a special case for unions of codecs with encode = t.identity. If t.identity itself were branded, then you could know at the type level and at runtime that it's safe to encode with any of the sub-types. It would cover the "simple" cases where io-ts is just used for validation, which are quite common:

t.union([t.string, t.number, t.type({ myProp: t.boolean })])

☝️ that's assuming it's possible to "propagate" the identity encoder from props, i.e. t.type({ myProp: t.boolean }) satisfies the requirement of all its props being identity-encoders, so it is one too.

note that basically I is always unknown in the stable API too.

Not in the case of typeA.pipe(typeB), but it sounds like that whole flow is going to change somewhat - if so, some kind of migration might be worth noting somewhere in the docs.


gcanti commented May 1, 2020

maybe there could be a special case for unions of codecs with encode = t.identity

@mmkal maybe, but supporting untagged unions means that the implementation of Schemable's union

declare function union<A extends ReadonlyArray<unknown>>(
  ...members: { [K in keyof A]: Encoder<A[K]> }
): Encoder<A[number]>

should work with any Encoder.

Not in the case of typeA.pipe(typeB)

I think that most of the times .pipe(...) can be replaced by parse

declare function parse<A, B>(from: Decoder<A>, parser: (a: A) => Either<string, B>): Decoder<B>
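A minimal stand-alone sketch of how `parse` covers the old `pipe` use case (hand-rolled types, not the real io-ts implementation):

```typescript
// Hand-rolled stand-ins for illustration -- not the real io-ts definitions.
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

interface Decoder<A> {
  decode: (u: unknown) => Either<string, A>
}

const string: Decoder<string> = {
  decode: (u) => (typeof u === 'string' ? right(u) : left('not a string'))
}

// parse: run a decoder, then feed the success into a kleisli arrow
const parse = <A, B>(from: Decoder<A>, parser: (a: A) => Either<string, B>): Decoder<B> => ({
  decode: (u) => {
    const e = from.decode(u)
    return e._tag === 'Left' ? e : parser(e.right)
  }
})

// where `typeA.pipe(typeB)` was used before, the second stage becomes a parser
export const IntFromString = parse(string, (s) => {
  const n = Number(s)
  return Number.isInteger(n) ? right(n) : left(`cannot parse ${JSON.stringify(s)} to an integer`)
})
```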

@leemhenson

I think it would be useful to include a pipe-like fn anyway, for when you already have two decoders that you want to sequence:

import { flow } from 'fp-ts/lib/function';
import * as E from 'fp-ts/lib/Either';
import * as D from 'io-ts/lib/Decoder';

const pipe = <A>(da: D.Decoder<A>) => <B>(db: D.Decoder<B>): D.Decoder<B> => ({
  decode: flow(da.decode, E.chain(db.decode)),
});

Or, I dunno, would calling that D.chain make more sense? That's what I instinctively reached for.

@leemhenson

Another thing I miss from the new API is a name attribute on a Decoder/Encoder. I like to wrap the library's own DecodeError in my own Error subclass so I can attach more metadata, using something like this:

import { flow } from 'fp-ts/lib/function';
import * as E from 'fp-ts/lib/Either';
import { Decoder } from 'io-ts/lib/Decoder';

// makeIoTsDecodeError is my own helper for wrapping DecodeError
export const decode = <A>(
  decoder: Decoder<A>,
  errorMetadata?: Record<string, unknown>
) =>
  flow(
    decoder.decode,
    E.mapLeft(error => makeIoTsDecodeError(error, errorMetadata))
  );

I used to be able to automatically smoosh in the Type<A, O, I>.name:

makeIoTsDecodeError(error, { decoderName: decoder.name, ...errorMetadata })

But that's not possible with Decoder<A> now.

I suppose I could extend Decoder to add it back in just inside my projects but I wonder what was the reason for dropping name?


gcanti commented May 27, 2020

I think it would be useful to include a pipe-like fn anyway, for when you already have two decoders that you want to sequence

example?

I wonder what was the reason for dropping name?

Because it makes the APIs unnecessarily complicated. IMO the error messages are readable even without the names (and you can always use withExpected if you don't like the default).

@leemhenson

example?

I know D.parse exists for effectively chaining A => B on the end of an earlier decoder, but you have to return Either<string, B>. If I already have a decoder that provides A => B then it's easiest for me to just compose them together. I suppose there's some tension there because decode is always taking u: unknown as opposed to a hard requirement on A, which D.parse does give you. 🤷

In some ways this feels similar to the discussion we had recently about re-adding the second type parameter to Encoder. It almost feels to me like we should have Decoder<Output, Input = unknown>, and D.chain(dua: Decoder<A>, dab: Decoder<B, A>) => Decoder<B>.

withExpected

Yeah I think i need to experiment with that a bit - I guess I would use it to replace the text inside a leaf?


mmkal commented Jun 15, 2020

@gcanti re D.parse vs the old Type.prototype.pipe: what would you recommend for a base64-JSON decoder, i.e. one that decodes base64, parses the decoded string as JSON, then validates the JSON using io-ts? An example use case is handling Kinesis events, which trigger lambdas with base64 payloads. With the current io-ts, it's possible to use a combinator like:

const MyEvent = kinesisEvent(t.type({ foo: t.string }))

The kinesisEvent combinator can ensure:

  • the input value looks like { records: Array<{ recordId: string; data: string }> }
  • each records[*].data is a string
  • each string is valid base64
  • when decoded, the strings are valid JSON
  • when JSON-parsed, the decoded strings have structure { foo: string }

Is this possible with the new D.parse? Would you recommend @leemhenson's method - if so it would be great if there were a first-class helper for it to avoid many implementations in downstream projects that might miss edge cases.


Another question, since this issue title is "High-level explanation of API changes". Could you give a recommendation for users who rely on myType.props for interface and partial types, and myType.types for union and intersection types in the stable API? From comments like this it sounds like there isn't a replacement yet, for reflection-like functionality. Use cases include UI-generation, dynamic codec manipulation, etc. In this comment you mentioned development of the stable API is frozen. Does this mean the functionality is going to be replaced by something else?


gcanti commented Jun 15, 2020

Is this possible with the new D.parse?

Yes, but there are many different ways to get the final result, so I guess it really depends on your coding style ("everything is a decoder" vs "I want to lift my business / parsing logic only once").

For example

import * as E from 'fp-ts/lib/Either'
import { Json } from 'io-ts/lib/JsonEncoder'
import * as D from 'io-ts/lib/Decoder'
import { flow } from 'fp-ts/lib/function'

// > decodes base 64
declare function decodeBase64(s: string): E.Either<string, string>

// > parses the decoded string as JSON
declare function parseJSON(s: string): E.Either<string, Json>

// > then validates the json using io-ts
declare function decodeItem(json: Json): E.Either<string, { foo: string }>

const parser = flow(decodeBase64, E.chain(parseJSON), E.chain(decodeItem))

export const X = D.parse(D.string, parser)

it sounds like there isn't a replacement yet

Actually one of the goals of my rewrite was to get rid of those meta infos (at the type level)


gcanti commented Jun 23, 2020

@leemhenson @mmkal given the good results in #478 I'm going to make the following breaking changes:

  • Decoder
    • change DecoderError
    • remove never
    • make parse pipeable and change its parser argument
  • Guard
    • remove never
  • Schemable
    • make intersections pipeable
    • make refinements pipeable

parse

from

export declare function parse<A, B>(from: Decoder<A>, parser: (a: A) => Either<string, B>): Decoder<B>

to

export declare function parse<A, B>(parser: (a: A) => E.Either<DecodeError, B>): (from: Decoder<A>) => Decoder<B>

Pros:

  • more general (DecodeError instead of string)
  • pipeable
  • should accommodate the pipe use case
import { pipe } from 'fp-ts/lib/pipeable'
import * as D from '../src/Decoder2'
import { Json } from '../src/JsonEncoder'

// > decodes base 64
declare const Base64: D.Decoder<string>

// > parses the decoded string as JSON
declare const Json: D.Decoder<Json>

// > then validates the json using io-ts
declare const Item: D.Decoder<{ foo: string }>

export const X = pipe(D.string, D.parse(Base64.decode), D.parse(Json.decode), D.parse(Item.decode))

@leemhenson

Are you changing the signature of

export interface Decoder<A> {
  readonly decode: (u: unknown) => Either<DecodeError, A>
}

to:

export interface Decoder<A, B = unknown> {
  readonly decode: (u: B) => Either<DecodeError, A>
}

?

Otherwise Base64 and Json are both going to have to repetitively test inside their decode implementations whether u is actually a string before performing string-based operations. Micro-optimizations maybe, but as you combine more and more parsers together to operate on larger and larger structures, it could start to add up.


gcanti commented Jun 23, 2020

@leemhenson isn't that what happens in your proposal too?

const pipe = <A>(da: D.Decoder<A>) => <B>(db: D.Decoder<B>): D.Decoder<B> => ({
  decode: flow(da.decode, E.chain(db.decode)),
});

@leemhenson

Yes, I never said mine was optimal 😅 . I'm just re-raising the point I made earlier:

In some ways this feels similar to the discussion we had recently about re-adding the second type parameter to Encoder. It almost feels to me like we should have Decoder<Output, Input = unknown>, and D.chain(dua: Decoder<A>, dab: Decoder<B, A>) => Decoder<B>.

If we did that, then the piped decoders wouldn't need to keep checking the same things over and over.

@leemhenson

Why is that in bold? Emphasis not mine! 😬


gcanti commented Jun 23, 2020

@leemhenson maybe it's just a bias of mine but I consider a "proper decoder" an arrow that goes from unknown to some type A

unknown -> M<A>

for some effect M, because otherwise it's just a "normal" kleisli arrow

A -> M<B>

and I use chain to compose those.

So personally I would model my pipeline starting from normal kleisli arrows and then I would define a suitable decoder based on the use case at hand.

// kleisli arrows in my domain

declare function decodeBase64(s: string): E.Either<string, string>

declare function parseJSON(s: string): E.Either<string, Json>

declare function decodeItem(json: Json): E.Either<string, { foo: string }>

// I can compose them as usual using the `Monad` instance of `Either`

const decode = flow(decodeBase64, E.chain(parseJSON), E.chain(decodeItem))

// and then define my decoder

export const MyDecoder = pipe(
  D.string,
  D.parse((s) =>
    pipe(
      decode(s),
      E.mapLeft((e) => D.error(s, e))
    )
  )
)

Alternatively I could have already defined some decoders

declare const Base64: D.Decoder<string>

declare const Json: D.Decoder<Json>

declare const Item: D.Decoder<{ foo: string }>

if this is the case, again I can compose them via parse

// and I can compose them via parse

export const MyDecoder2 = pipe(Base64, D.parse(Json.decode), D.parse(Item.decode))

// or even this if I want micro optimizations
export const MyDecoder3 = pipe(
  Base64,
  D.parse((s) =>
    pipe(
      parseJSON(s),
      E.mapLeft((e) => D.error(s, e))
    )
  ),
  D.parse(Item.decode)
)

Do we really need something more? Genuine question, I'm open to suggestions if you think there's an ergonomic issue with the current APIs.

In the end if you can define a

export interface Decoder<A, B> {
  readonly decode: (a: A) => Either<DecodeError, B>
}

then you can just define f = (a: A) => Either<DecodeError, B> and use parse to compose f with a Decoder<A>

@leemhenson

It's not a major issue, just a niggle I keep encountering because I have scenarios like these:

  • desire to decode from some wire representation into a branded type, for simplicity let's say Int
  • wire types might be numeric or a string representation of a number, e.g. 3 or "3"
  • an Int might also be further branded into FooId or Cents or something, and that logic might include doing some bounds checking

So in this case I would like to have primitive decoders:

intFromNumber: Decoder<Int, number>
intFromString: Decoder<Int, string>
fooIdFromInt: Decoder<FooId, Int>
centsFromInt: Decoder<Cents, Int>

then I can compose them together to make complex ones:

pipe(
  stringFromUnknown,
  intFromString,
  fooIdFromInt,
) // => Decoder<FooId, unknown>

I could wrap all that logic up using Decoder<FooId> as it is today but if I have useful chains of decoders that I want to reuse it always ends up feeling like it would be more elegant to make the composition at the Decoder level.

But, hey, that's just me. I might just be using the wrong tool for the job. 🤷
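The input-indexed decoder wished for above can be sketched in a few lines. This is a hypothetical shape with hand-rolled types, not an io-ts API; the `Int` brand, `stringFromUnknown` and `intFromString` names follow the comment's examples:

```typescript
// Hypothetical input-indexed decoder, to make the composition above concrete.
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

interface Decoder<I, A> {
  decode: (i: I) => Either<string, A>
}

// compose: the output of one decoder feeds the input of the next
const compose = <A, B>(ab: Decoder<A, B>) => <I>(ia: Decoder<I, A>): Decoder<I, B> => ({
  decode: (i) => {
    const e = ia.decode(i)
    return e._tag === 'Left' ? e : ab.decode(e.right)
  }
})

interface IntBrand { readonly Int: unique symbol }
type Int = number & IntBrand

const stringFromUnknown: Decoder<unknown, string> = {
  decode: (u) => (typeof u === 'string' ? right(u) : left('not a string'))
}
const intFromString: Decoder<string, Int> = {
  decode: (s) => {
    const n = Number(s)
    return Number.isInteger(n) ? right(n as Int) : left('not an integer string')
  }
}

// Decoder<unknown, Int>: each stage only checks what the previous one hasn't proven
export const intFromUnknown = compose(intFromString)(stringFromUnknown)
```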


gcanti commented Jun 24, 2020

it always ends up feeling like it would be more elegant to make the composition at the Decoder level

Well, while I love io-ts, as a user I would try (as far as possible) to not leak an implementation detail (i.e. which library I'm using to validate / decode at the border). In my app domain I would prefer to define

intFromNumber: number -> Either<MyDomainError, Int>
intFromString: string -> Either<MyDomainError, Int>
fooIdFromInt: Int -> Either<MyDomainError, FooId>
centsFromInt: Int -> Either<MyDomainError, Cents>

that will be future-proof even if I replace io-ts with another solution.

In elm-ts I removed the hard dependency on io-ts by just requiring a kleisli arrow.
I will do the same for fp-ts-routing in the next major release.

But that's just a point of view, yours is sensible too.

Let me just think more about all of this...

@leemhenson

... as a user I would try (as far as possible) to not leak an implementation detail (i.e. which library I'm using to validate / decode at the border). In my app domain I would prefer to define ...

Yeah I'm only talking about composition of Decoders as a means to construct the larger Decoders that I use at the app boundary to convert unknown => Either<DecodeError, SomeComplexNestedProduct>. I don't see the Decoder in the rest of the codebase.


gcanti commented Jun 24, 2020

In the end if you can define a

export interface Decoder<A, B> {
readonly decode: (a: A) => Either<DecodeError, B>
}

then you can just define f = (a: A) => Either<DecodeError, B>

However the converse is also true, so my POV is actually biased and without noticeable substance, @leemhenson I'll reconsider my position.


gcanti commented Jun 25, 2020

@leemhenson while experimenting with kleisli arrows, it looks like I found something more general than DecoderT (see the Kleisli module):

interface Kleisli<M extends URIS2, I, E, A> {
  readonly decode: (i: I) => Kind2<M, E, A>
}

for which I can define a compose operation

const compose = <M extends URIS2, E>(M: Monad2C<M, E>) => <A, B>(ab: Kleisli<M, A, E, B>) => <I>(
  ia: Kleisli<M, I, E, A>
): Kleisli<M, I, E, B> => ({
  decode: (i) => M.chain(ia.decode(i), ab.decode)
})

Then from Kleisli I can derive KleisliDecoder

interface KleisliDecoder<I, A> extends K.Kleisli<E.URI, I, DecodeError, A> {}

and from KleisliDecoder I can derive our old Decoder

interface Decoder<A> extends KD.KleisliDecoder<unknown, A> {}

Example

import { pipe } from 'fp-ts/lib/pipeable'
import * as D from '../src/Decoder'
import * as KD from '../src/KleisliDecoder'

interface IntBrand {
  readonly Int: unique symbol
}
type Int = number & IntBrand
interface CentsBrand {
  readonly Cents: unique symbol
}
type Cents = number & CentsBrand

declare const IntFromString: KD.KleisliDecoder<string, Int>
declare const CentsFromInt: KD.KleisliDecoder<Int, Cents>

// const result: D.Decoder<Cents>
export const result = pipe(
  D.string, 
  D.compose(IntFromString), 
  D.compose(CentsFromInt)
)

@leemhenson

Love it.



gcanti commented Jun 26, 2020

KleisliDecoder is quite interesting in that its input type is fine grained (i.e. not a generic Record<string, I>) and depends on the fields passed in

/*
const kdecoder: KD.KleisliDecoder<{
    name: unknown;
    age: string;
    cents: Int;
}, {
    name: string;
    age: Int;
    cents: Cents;
}>
*/
export const kdecoder = KD.type({
  name: D.string,
  age: IntFromString,
  cents: CentsFromInt
})

EDIT: same for tuple, etc...

// const kdecoder2: KD.KleisliDecoder<[unknown, string, Int], [string, Int, Cents]>
export const kdecoder2 = KD.tuple(D.string, IntFromString, CentsFromInt)


gcanti commented Jun 29, 2020

@DylanRJohnston

I noticed there's a schemable instance for the old io-ts type representation but it drops a lot of the structure from the type.

import { interpreter, make } from 'io-ts/Schema';
import { Schemable } from 'io-ts/Type'

const Foo = make(S =>
  S.struct({
    foo: S.string,
    bar: S.number,
  }),
);

// Type<{foo: string; bar: number;}>, instead of TypeC<{ foo: StringC, bar: NumberC }>
const foo = interpreter(Schemable)(Foo);


gcanti commented Apr 24, 2021

Allowing encoding (serialisation) to return either an error or a successfully encoded value

@treybrisbane that's an interesting approach and I agree that there are good use cases for it.

What's the difference between a decoder and an encoder though?

type Decoder<I, E, A> = (i: I) => Either<E, A>
type Encoder<I, A> = (i: I) => A

Precisely the fact that encoders can't fail.

So if we allow encoders to possibly fail then I think we can just unify decoders and encoders: they are just all decoders.

import { Either, right } from 'fp-ts/lib/Either'

export type Decoder<I, E, A> = (i: I) => Either<E, A>

export type DecodeError = string

// which one is a "decoder" and which an "encoder"? it doesn't matter anymore

export const NumberFromString: Decoder<string, DecodeError, number> = (s) =>
  right(parseFloat(s))

export const StringFromNumber: Decoder<number, DecodeError, string> = (n) =>
  right(String(n))


gcanti commented Apr 26, 2021

At first it sounds a bit naïve, but it makes sense: In order to know which union constituent to encode with, you need to check the value.

@treybrisbane I'm not sure I understand how it works, let's say we unify decoders and encoders and we have the following:

import { pipe } from 'fp-ts/function'
import * as D from 'io-ts/Decoder'

// const trim: D.Decoder<string, string>
export const trim = pipe(
  D.id<string>(),
  D.map((s) => s.trim())
)
// const double: D.Decoder<number, number>
export const double = pipe(
  D.id<number>(),
  D.map((n) => n * 2)
)

how would you define a decoder: Decoder<string | number, string | number>?


treybrisbane commented Apr 27, 2021

@gcanti I'm assuming D.id in your example is just the identity D.Decoder?

If so, you'd need to replace D.id<string>() with D.string, and D.id<number>() with D.number. This is because you need trim to fail if it's passed a non-string, and double to fail if it's passed a non-number.
You can then implement a union decoder sorta like this:

import { pipe } from 'fp-ts/function';
import * as E from 'fp-ts/Either';
import * as D from 'io-ts/Decoder';

const union = <I, E1, A1, E2, A2>(
  decoder1: D.Decoder<I, E1, A1>,
  decoder2: D.Decoder<I, E2, A2>,
): D.Decoder<I, E2, A1 | A2> => ({
  decode: (i) => {
    const first = decoder1.decode(i);
    return E.isRight(first) ? first : decoder2.decode(i);
  },
});

const trimOrDouble = union(trim, double);

(I'm rushing this so I've probably made a mistake, but the idea is that you run the first decoder, return the result if it's successful, otherwise run the second decoder, return the result if it's successful, etc.)
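A self-contained version of that fallback idea. The hand-rolled types below are stand-ins; the three-parameter Decoder is the hypothetical unified shape from the discussion above, not an actual io-ts type:

```typescript
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

// hypothetical unified decoder: input, error and output type parameters
interface Decoder<I, E, A> {
  decode: (i: I) => Either<E, A>
}

// try the first member; on failure, fall back to the second
const union = <I, E1, A1, E2, A2>(
  d1: Decoder<I, E1, A1>,
  d2: Decoder<I, E2, A2>
): Decoder<I, E2, A1 | A2> => ({
  decode: (i) => {
    const e = d1.decode(i)
    return e._tag === 'Right' ? e : d2.decode(i)
  }
})

const stringD: Decoder<unknown, string, string> = {
  decode: (u) => (typeof u === 'string' ? right(u) : left('not a string'))
}
const numberD: Decoder<unknown, string, number> = {
  decode: (u) => (typeof u === 'number' ? right(u) : left('not a number'))
}

export const stringOrNumber = union(stringD, numberD)
```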


steida commented May 13, 2021

I did not find an example for a generic Decoder, so I experimented. Maybe this will be useful for someone.

const endpoint = <
  A extends Record<string, unknown>,
  R extends Record<string, unknown>,
>({
  args = {} as { [K in keyof A]: D.Decoder<unknown, A[K]> },
  result = {} as { [K in keyof R]: D.Decoder<unknown, R[K]> },
}: {
  args?: { [K in keyof A]: D.Decoder<unknown, A[K]> };
  result?: { [K in keyof R]: D.Decoder<unknown, R[K]> };
}) => ({
  args: D.struct(args),
  result: D.struct(result),
});

const api = {
  randomThing: endpoint({
    result: { thing: Thing },
  }),
  thingById: endpoint({
    args: { id: D.string },
    result: { thing2: Thing },
  }),
  fooThing: endpoint({
    args: { id: D.string },
  }),
};


steida commented May 16, 2021

@gcanti Is it possible to define or enforce an empty struct? The use case is an endpoint with args or result as an empty object.

const EmptyStruct = C.struct({});
// This should be an error, IMHO.
EmptyStruct.encode({a: 1})


steida commented May 16, 2021

BTW, is using branding recommended? It's weird that branded props are visible on primitives.



gcanti commented May 17, 2021

Is it possible to define or enforce an empty struct?

@steida I would define a custom decoder


gcanti commented May 17, 2021

Hi all, I've been working on a new version of the Decoder experimental module for a few months and I would like to show you my findings.

This time I started from a bunch of use cases and long standing issues.

The first two biggest changes are:

  • the error model
  • These instead of Either as decoding result

therefore the Decoder signature has changed from:

export interface Decoder<I, A> {
  readonly decode: (i: I) => Either<DecodeError, A>
}

to:

export interface Decoder<I, E, A> {
  readonly decode: (i: I) => These<E, A>
}

which should unlock the possibility to:

There are many other use cases I'm trying to solve but I want to start from this list to get early feedback from you all.

You can find the source code in the poc branch: the poc.ts file contains all the relevant code so you can easily copy / paste and start playing.
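Mechanically, the move from Either to These can be sketched like this (hand-rolled stand-ins, not the poc code): the extra Both constructor is what lets a decoder succeed while still reporting a warning.

```typescript
// Hand-rolled These for illustration -- Both carries an error *and* a value,
// which plain Either cannot express.
type These<E, A> =
  | { _tag: 'Left'; left: E }
  | { _tag: 'Right'; right: A }
  | { _tag: 'Both'; left: E; right: A }
const leftT = <E>(e: E): These<E, never> => ({ _tag: 'Left', left: e })
const rightT = <A>(a: A): These<never, A> => ({ _tag: 'Right', right: a })
const bothT = <E, A>(e: E, a: A): These<E, A> => ({ _tag: 'Both', left: e, right: a })

interface Decoder<I, E, A> {
  decode: (i: I) => These<E, A>
}

// hypothetical number decoder: hard failure on non-numbers, warning (Both) on NaN
export const number: Decoder<unknown, string, number> = {
  decode: (u) =>
    typeof u !== 'number' ? leftT('not a number') : Number.isNaN(u) ? bothT('NaN', u) : rightT(u)
}
```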

customization of error types

Let's say we want to check the minimum length of a string:

// the model of the custom error
export interface MinLengthE<N extends number> {
  readonly _tag: 'MinLengthE'
  readonly minLength: N
  readonly actual: string
}

all custom errors must be wrapped in a LeafE error (a technical requirement):

export interface MinLengthLE<N extends number> extends LeafE<MinLengthE<N>> {}

// constructor
export const minLengthLE = <N extends number>(minLength: N, actual: string): MinLengthLE<N> =>
  leafE({ _tag: 'MinLengthE', minLength, actual })

now I can define my custom combinator:

export const minLength = <N extends number>(minLength: N): Decoder<string, MinLengthLE<N>, string> => ({
  decode: (s) => (s.length >= minLength ? success(s) : failure(minLengthLE(minLength, s)))
})

const string3 = minLength(3)
assert.deepStrictEqual(string3.decode('abc'), success('abc'))
assert.deepStrictEqual(string3.decode('a'), failure(minLengthLE(3, 'a')))

make io-ts more suitable for form decoding

Let's use string3 in a fromStruct:

export const PersonForm = fromStruct({
  name: string3,
  age: number
})
/*
const PersonForm: FromStructD<{
    name: Decoder<string, MinLengthLE<3>, string>;
    age: numberUD;
}>
*/

The decoding error is fully typed, this means that you can pattern match on the error:

export const formatPersonFormE = (de: ErrorOf<typeof PersonForm>): string =>
  de.errors
    .map((e): string => {
      switch (e.key) {
        case 'name':
          //     this is of type `MinLengthE<3>` ---v
          return `invalid name, must be ${e.error.error.minLength} or more characters long`
        case 'age':
          return 'invalid age'
      }
    })
    .join(', ')

assert.deepStrictEqual(
  pipe(PersonForm.decode({ name: 'name', age: 18 }), TH.mapLeft(formatPersonFormE)),
  success({ name: 'name', age: 18 })
)
assert.deepStrictEqual(
  pipe(PersonForm.decode({ name: '', age: 18 }), TH.mapLeft(formatPersonFormE)),
  failure('invalid name, must be 3 or more characters long')
)
assert.deepStrictEqual(
  pipe(PersonForm.decode({ name: '', age: null }), TH.mapLeft(formatPersonFormE)),
  failure('invalid name, must be 3 or more characters long, invalid age')
)

return warnings other than errors

The number decoder can raise a NumberE error but also two warnings:

  • NaNE
  • InfinityE
export const formatNumberE = (de: ErrorOf<typeof number>): string => {
  switch (de.error._tag) {
    case 'NumberE':
      return 'the input is not even a number'
    case 'NaNE':
      return 'the input is NaN'
    case 'InfinityE':
      return 'the input is Infinity'
  }
}

assert.deepStrictEqual(pipe(number.decode(1), TH.mapLeft(formatNumberE)), success(1))
assert.deepStrictEqual(pipe(number.decode(null), TH.mapLeft(formatNumberE)), failure('the input is not even a number'))
assert.deepStrictEqual(pipe(number.decode(NaN), TH.mapLeft(formatNumberE)), warning('the input is NaN', NaN))

optionally fail on additional properties

Additional properties are still stripped out, but they are also reported as warnings:

export const A = struct({ a: string })

assert.deepStrictEqual(
  //                                   v-- this utility transforms a decoding error into a tree
  pipe(A.decode({ a: 'a', c: true }), draw),
  warning('1 error(s) found while checking keys\n└─ unexpected key "c"', { a: 'a' })
  // warning ---^                                                             ^-- stripped out result
)

Then you can choose to "absolve" the Both result to a Right or "condemn" to a Left.
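
As a self-contained sketch of those two operations (the names come from the sentence above; the implementations here are assumptions, not the poc's code):

```typescript
// Minimal These type, as used by the new Decoder signature.
type These<E, A> =
  | { _tag: 'Left'; left: E }
  | { _tag: 'Right'; right: A }
  | { _tag: 'Both'; left: E; right: A }

// "absolve": treat warnings as success, Both e a ~> Right a
const absolve = <E, A>(t: These<E, A>): These<E, A> =>
  t._tag === 'Both' ? { _tag: 'Right', right: t.right } : t

// "condemn": treat warnings as failure, Both e a ~> Left e
const condemn = <E, A>(t: These<E, A>): These<E, A> =>
  t._tag === 'Both' ? { _tag: 'Left', left: t.left } : t
```

Plain `Left` and `Right` results pass through both functions unchanged; only `Both` is reinterpreted.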

Since additional properties are reported as warnings rather than errors, this mechanism plays well with intersections too.
I wrote an algorithm based on the following statement: a property is considered additional if it is additional for each member of the intersection.

export const B = struct({ b: number })
export const AB = pipe(A, intersect(B))

assert.deepStrictEqual(
  pipe(AB.decode({ a: 'a', b: 1, c: true }), draw),
  warning(
    `2 error(s) found while decoding (intersection)
├─ 1 error(s) found while decoding member 0
│  └─ 1 error(s) found while checking keys
│     └─ unexpected key "c"
└─ 1 error(s) found while decoding member 1
   └─ 1 error(s) found while checking keys
      └─ unexpected key "c"`,
    { a: 'a', b: 1 }
  )
)

^ here only the "c" property is reported as additional
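
The rule stated above can be sketched in a few lines, assuming the set of known keys of each intersection member is available (the helper name here is mine, not the poc's):

```typescript
// A key of the input is additional only if *every* member of the
// intersection considers it additional (i.e. no member declares it).
const additionalKeys = (
  input: Record<string, unknown>,
  memberKeys: ReadonlyArray<ReadonlyArray<string>>
): ReadonlyArray<string> =>
  Object.keys(input).filter((k) => memberKeys.every((keys) => !keys.includes(k)))
```

For the `AB` example above, `additionalKeys({ a: 'a', b: 1, c: true }, [['a'], ['b']])` reports only `"c"`.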

@gcanti
Owner

gcanti commented May 18, 2021

all custom errors must be wrapped in a LeafE error (a technical requirement)

Why is that? Because the error type in Decoder is now generic (E), which means that on one hand we get a fully typed error (see the comment above about form handling), but on the other hand it poses a new problem: how do we handle errors in a generic way? For example, how do we define a toTree utility that transforms any decoding error into a Tree<string>?

A possible solution is to define a sum type representing all errors:

// I can pattern match on `DecodeError` while retaining the possibility to define custom errors
export type DecodeError<E> =
  | UnexpectedKeysE
  | MissingKeysE
  | UnexpectedIndexesE
  | MissingIndexesE
  | LeafE<E> // <= leaf error
  | NullableE<E>
  | etc...

where LeafE represents a "leaf error" and can contain custom errors too.

When I try to define a toTree function:

declare const toTree: <E>(de: DecodeError<E>) => Tree<string>

I also need a way to transform a generic E to Tree<string> so I define toTreeWith instead:

export declare const toTreeWith: <E>(toTree: (e: E) => Tree<string>) => (de: DecodeError<E>) => Tree<string>

Now I can define toTree as

const toTree = toTreeWith(toTreeBuiltin)

where toTreeBuiltin: (de: BuiltinE) => Tree<string> is a helper able to serialize the built-in errors.

But what if the decoding error contains a custom error? Fortunately the whole mechanism is type-safe and I get a TypeScript error:

pipe(PersonForm.decode({ name: '', age: 18 }), TH.mapLeft(toTree)) // Type 'MinLengthLE<3>' is not assignable to type 'LeafE<BuiltinE>'

As a fix I can define my custom toTree function

//                                    my custom error --v
export const myToTree = toTreeWith((e: BuiltinE | MinLengthE<number>) => {
  switch (e._tag) {
    case 'MinLengthE':
      return tree(`cannot decode ${format(e.actual)}, must be ${e.minLength} or more characters long`)
    default:
      return toTreeBuiltin(e)
  }
})

assert.deepStrictEqual(
  pipe(PersonForm.decode({ name: '', age: 18 }), TH.mapLeft(myToTree)),
  failure(
    tree('1 error(s) found while decoding (struct)', [
      tree('1 error(s) found while decoding required key "name"', [
        tree('cannot decode "", must be 3 or more characters long')
      ])
    ])
  )
)

A notable decoding error is MessageE, which lets you express a custom error message:

use case: old decoder, custom error message

// throw-away utility for this issue
export const toString = flow(draw, print)

export const mystring = pipe(
  string,
  mapLeft(() => message(`please insert a string`))
)

assert.deepStrictEqual(
  pipe(string.decode(null), toString),
  `Errors:
cannot decode null, expected a string` // <= default message
)
assert.deepStrictEqual(
  pipe(mystring.decode(null), toString),
  `Errors:
please insert a string` // <= custom message
)

use case: new decoder, custom error message

The message constructor can be used with new decoders too:

export const date: Decoder<unknown, MessageLE, Date> = {
  decode: (u) => (u instanceof Date ? success(u) : failure(message('not a Date')))
}

assert.deepStrictEqual(
  pipe(date.decode(null), toString),
  `Errors:
not a Date`
)

use case: new decoder, multiple custom messages (#487)

export interface UsernameBrand {
  readonly Username: unique symbol
}

export type Username = string & UsernameBrand

const USERNAME_REGEX = /(a|b)*d/

export const Username = pipe(
  mystring,
  compose({
    decode: (s) =>
      s.length < 2
        ? failure(message('too short'))
        : s.length > 4
        ? failure(message('too long'))
        : USERNAME_REGEX.test(s)
        ? failure(message('bad characters'))
        : success(s as Username)
  })
)

assert.deepStrictEqual(
  pipe(tuple(Username, Username, Username, Username, Username).decode([null, 'a', 'bbbbb', 'abd', 'ok']), toString),
  `Errors:
4 error(s) found while decoding (tuple)
├─ 1 error(s) found while decoding required component 0
│  └─ please insert a string
├─ 1 error(s) found while decoding required component 1
│  └─ too short
├─ 1 error(s) found while decoding required component 2
│  └─ too long
└─ 1 error(s) found while decoding required component 3
   └─ bad characters`
)

@steida

steida commented Jun 1, 2021

It seems C.sum is not type-safe, while D.sum is.

I can mistype 'type' or 'A', and TS is silent.

const MySum = C.sum('type')({
  A: C.struct({ type: C.literal('A'), a: C.string }),
  B: C.struct({ type: C.literal('B'), b: C.number }),
});

@safareli
Contributor

safareli commented Jun 7, 2021

@gcanti the poc looks good! One thing I would like to suggest is to add an error node that decorates a subtree with a string (so that withMessage can be implemented). Currently, when users are using their own custom errors, these error nodes sit at the leaf level and can't be used to decorate, for example, an intersection of a couple of D.structs.

@steida

steida commented Jun 14, 2021

@gcanti I just tried poc and it's awesome. I'm looking forward to the release.

@safareli
Contributor

link to the poc for future readers:
https://github.com/gcanti/io-ts/blob/poc/src/poc.ts

@fernandomrtnz

fernandomrtnz commented Jan 28, 2022

@gcanti Wanted to report a use case for consideration when developing the experimental interfaces, and to see if there's a workaround we're missing.

We use the Schema interface to describe the API payloads we receive over the wire. We then derive Decoders from the schemas and map the raw API payload into a shape that better fits our domain model used in business logic. We have a need for guards that correspond to the domain model type instead of the raw API schemas. We would like to avoid having to keep in sync a Decoder and Guard declaration for the same domain model type.

Is it possible to derive a Guard from the mapped result of a Decoder?


Here's our best attempt so far, using a Schema; it can be adapted to work from just a Decoder:

const [DomainModelDecoder, DomainModelGuard] = pipe(
	iotsDecoderInterpreter(ApiSchema), // Decoder derived from schema.
	(apiOutputDecoder) => {
		const decoderWithDomainTransformation = pipe(
			apiOutputDecoder,
			iotsDecoder.map((resourceApiOutput) => /* Domain transformation logic. */)
		)

		const guardFromDecoder = (
			decoder: typeof decoderWithDomainTransformation
		) => (
			value: unknown
		): value is iotsDecoder.TypeOf<typeof decoderWithDomainTransformation> =>
			isRight(decoder.decode(value))

		return [
			decoderWithDomainTransformation,
			guardFromDecoder(decoderWithDomainTransformation),
		]
	}
)

@florianbepunkt

Is it still possible to create an untagged union with the help of a guard? I can't figure out what the new signature of union should look like with the current Decoder module (not the POC):

interface Compat<A> extends D.Decoder<A>, E.Encoder<A>, G.Guard<A> {}

function make<A>(codec: C.Codec<A>, guard: G.Guard<A>): Compat<A> {
  return {
    is: guard.is,
    decode: codec.decode,
    encode: codec.encode
  }
}

function union<A extends ReadonlyArray<unknown>>(
  ...members: { [K in keyof A]: Compat<A[K]> }
): Compat<A[number]> {
  return {
    is: G.guard.union(...members).is,
    decode: D.decoder.union(...members).decode,
    encode: (a) => {
      for (const member of members) {
        if (member.is(a)) {
          return member.encode(a)
        }
      }
      // unreachable if the guards cover the union, but makes `encode` total
      throw new Error('no member matched the value to encode')
    }
  }
}

@jamiehodge

@gcanti thanks for this new API. We like it very much. 1. Do you have plans to graduate it to stable, and 2. when you do, would you consider adding top-level codec, decoder, etc. imports to match the shape of fp-ts?

@kalda341

kalda341 commented Mar 7, 2022

@florianbepunkt I have done basically this with the POC if you're interested:

Schemable:

import * as IoTsSchemable from 'io-ts/lib/Schemable';
import { hkt as HKT, option as O } from 'fp-ts';

export interface Schemable<S> extends IoTsSchemable.Schemable<S> {
  readonly union: <A, K extends string>(memberMap: {
    [R in K]: HKT.HKT<S, A>;
  }) => (f: (a: unknown) => O.Option<K>) => HKT.HKT<S, A>;
}
export interface Schemable1<S extends HKT.URIS>
  extends IoTsSchemable.Schemable1<S> {
  readonly union: <A, K extends string>(memberMap: {
    [R in K]: HKT.Kind<S, A>;
  }) => (f: (a: unknown) => O.Option<K>) => HKT.Kind<S, A>;
}
export interface Schemable2C<S extends HKT.URIS2, E> {
  readonly union: <A, K extends string>(memberMap: {
    [R in K]: HKT.Kind2<S, E, A>;
  }) => (f: (a: unknown) => O.Option<K>) => HKT.Kind2<S, E, A>;
}

Schema:

import { function as F } from 'fp-ts';
import * as IoTsSchemable from 'io-ts/lib/Schemable';
import * as IoTsSchema from 'io-ts/lib/Schema';
import * as Schemable from './Schemable';
import { hkt as HKT } from 'fp-ts';

export interface Schema<A> {
  <S>(S: Schemable.Schemable<S>): HKT.HKT<S, A>;
}

export const make = <T>(schema: Schema<T>): Schema<T> =>
  IoTsSchemable.memoize(schema);

export const interpreter: {
  <S extends HKT.URIS2>(S: Schemable.Schemable2C<S, unknown>): <A>(
    schema: Schema<A>
  ) => HKT.Kind2<S, unknown, A>;
  <S extends HKT.URIS>(S: Schemable.Schemable1<S>): <A>(
    schema: Schema<A>
  ) => HKT.Kind<S, A>;
} = F.unsafeCoerce(IoTsSchema.interpreter);

Type:

import * as IoTs from 'io-ts';
import * as T from 'io-ts/lib/Type';
import * as S from './Schemable';
import { function as F, option as O } from 'fp-ts';

export const Schemable: S.Schemable1<T.URI> = {
  ...T.Schemable,
  union:
    <A, K extends string>(memberMap: { [R in K]: T.Type<A> }) =>
    (getType: (x: unknown) => O.Option<K>) => {
      const members = Object.values(memberMap) as IoTs.Mixed[];
      const name = '(' + members.map((type) => type.name).join(' | ') + ')';

      return new IoTs.UnionType<typeof members>(
        name,
        (u: unknown): u is A =>
          F.pipe(
            getType(u),
            O.chain((k) => O.fromNullable(memberMap[k])),
            O.map((t) => t.is(u)),
            O.getOrElse(F.constFalse)
          ),
        (u, c) =>
          F.pipe(
            O.Do,
            O.apS('key', getType(u)),
            O.bind('codec', ({ key }) => O.fromNullable(memberMap[key])),
            O.map(({ key, codec }) =>
              codec.validate(u, IoTs.appendContext(c, key, codec, u))
            ),
            O.getOrElseW(() => IoTs.failure(u, c))
          ),
        (a) =>
          F.pipe(
            getType(a),
            O.chain((k) => O.fromNullable(memberMap[k])),
            O.map((t) => t.encode(a)),
            O.getOrElseW((): never => {
              throw new Error(
                `no codec found to encode value in union type ${name}`
              );
            })
          ),
        members
      );
    },
};

And Eq (bonus!):

import { function as F, option as O } from 'fp-ts';
import * as Eq from 'io-ts/lib/Eq';
import * as S from './Schemable';

export const Schemable: S.Schemable1<Eq.URI> = {
  ...Eq.Schemable,
  union: (memberMap) => (getType) => ({
    equals: (a, b): boolean =>
      F.pipe(
        O.Do,
        O.apS('ta', getType(a)),
        O.apS('tb', getType(b)),
        O.filter(({ ta, tb }) => ta === tb),
        O.chain(({ ta }) => O.fromNullable(memberMap[ta])),
        O.map((eq) => eq.equals(a, b)),
        O.getOrElse(F.constFalse)
      ),
  }),
};

@anthonyjoeseph

anthonyjoeseph commented Mar 13, 2022

As far as I can tell, the poc branch is 99% done - these are the only remaining updates I can see:

  • Codec2 is incomplete
  • Missing tests for Codec2
  • Maybe tests for TreeReporter can be extracted out from the existing tests for Decoder2? (Not sure if they're intentionally coupled)
  • Decoder2 is missing refine and parse (this could be intentional, not sure)
    • I imagine refine would accept a leaf type error, and parse would accept any arbitrary error(?)
  • TaskDecoder updates to mirror Decoder2

Does that about cover it? I'd happily make a PR to address these issues if that would help! I really really like these features - this is my number one dream for the fp-ts ecosystem - and I'd like to do whatever I can to help them get published in the experimental modules!

--

Just to address the comments suggesting changes:

@safareli (comment) - I think you're suggesting something like compoundE that only accepts a single 'child' instead of an array of them, is that right? I think that could easily be a feature request once this goes thru - maybe this could work in the meantime?

const test = pipe(
  D.struct({ a: D.string }),
  D.intersect(D.struct({ b: D.number })),
  D.mapLeft(NEA.of),
  D.mapLeft(DE.compoundE('custom error'))
)

@fernandomrtnz (comment) - Guards can be derived from schemas. Would it be possible to go from domain model schema -> Schema.interpret(Guard.schemable) -> domain model guard -> Decoder.fromGuard -> domain model decoder -> Decoder.map -> business logic decoder?

@jamiehodge (comment) - this has been a feature since #507

@kalda341 (comment) - I think this same behavior could be achieved using a 'Decoder.refine' for each case

const union = D.union(
  pipe(
    D.id<unknown>(),
    D.refine((u: unknown): u is string => typeof u === 'string', DE.message('string'))
  ),
  pipe(
    D.id<unknown>(),
    D.refine((u: unknown): u is number => typeof u === 'number', DE.message('number'))
  )
)

The interfaces for Decoder, Encoder, or any other Schema all imply the behavior of narrowing one type into another. Is there an advantage to requiring all union combinators to conform to a 'refinement-like' interface?

@kalda341

kalda341 commented Mar 13, 2022

@anthonyjoeseph The key is being able to define a schema which supports untagged unions for Eq and Decoder. 99.9% of my schemas are fine, however there are a few deeply nested hairy bits that I don't have a lot of control over that require me to use union.

Refinements do seem to be the way to go though (I didn't really understand them up until now), so I'll have a go at refactoring my implementation to use them.

Your PR plan sounds great BTW, it's just union that is killing me!

Update: Refine doesn't allow me to actually specify the schema which should apply to the refined type, so it doesn't work for my use case. For more context, the types I need to union look like:

interface A {
  a: string;
  b: string;
}

interface B {
  c: string;
  d: number;
}

@anthonyjoeseph

anthonyjoeseph commented Mar 13, 2022

@kalda341 afaict untagged unions of structs (or anything else) have always been supported by Decoder, poc or otherwise

const decoder: Decoder<unknown, A | B> = Dec.union(
  Dec.struct({ a: Dec.string, b: Dec.string }),
  Dec.struct({ c: Dec.string, d: Dec.number })
)

I just saw this comment, which helped me understand your solution - a more generalized union. I like it! However, since neither Eq nor Encoder are affected by the poc branch, imo it belongs on a separate feature request

@aldex32

aldex32 commented Mar 31, 2022

Hi all,

I am curious to know if there is a plan for when the experimental stage will end. I like the new way of creating codecs etc., but I am not sure if I can use it in production without worrying that the next release will break my code. I see this issue has been open since 2020.

@anthonyjoeseph

anthonyjoeseph commented Apr 17, 2022

I made this PR as an attempt to complete the poc branch. These kinds of PRs ultimately end up creating more work for the maintainers - however, I understand the community's concern that io-ts has stagnated (see here #635) and I'd like to surface the work that has been ongoing but less visible (it might also save other developers from re-inventing the wheel #636)

The PR is against the 'poc' branch so that a clear diff can be seen of my changes vs gcanti's, although the intention is for this to be ultimately merged into master as a continuation of the 'experimental' modules

I've also updated the READMEs to document these changes, and added one for TaskDecoder and Type. Hopefully, these docs are legible enough that this issue can be closed and further commentary can be moved to the PR

Since the whole point is to 'de-stagnate' io-ts, my hope is to focus further discussion on smaller tweaks & implementation details, and have additional major changes or additions made as their own PRs (the luxury of having these features marked as 'experimental')

As per the notes in the poc branch, I've moved all Schemable-related code into a separate repo called io-ts-contrib. It's registered under my GitHub account, but I'd be happy to transfer ownership to a maintainer of io-ts

@treybrisbane

I know this is a little tangential, but I just wanna restate that I feel encoding should be a potentially failing operation (previously mentioned above).
I'm not sure what the current state of the library is, but Giulio seemed receptive to the idea, so I was kinda hoping it might get included as part of this new API.

@anthonyjoeseph

anthonyjoeseph commented Apr 17, 2022

@treybrisbane it is - a codec is now a decoder and its dual packed together (as per gcanti's changes)
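
As an illustrative, self-contained sketch of that shape (not the actual poc types - the names and error type here are assumptions), with encoding allowed to fail just like decoding:

```typescript
// A codec packs a decoder with its dual; both directions return a result type.
type Result<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }

interface Codec<I, O, A> {
  readonly decode: (i: I) => Result<string, A>
  readonly encode: (a: A) => Result<string, O>
}

const NumberFromString: Codec<unknown, string, number> = {
  decode: (u) => {
    const n = typeof u === 'string' && u.trim() !== '' ? Number(u) : NaN
    return Number.isNaN(n) ? { _tag: 'Left', left: 'not a numeric string' } : { _tag: 'Right', right: n }
  },
  encode: (n) =>
    Number.isFinite(n)
      ? { _tag: 'Right', right: String(n) }
      : { _tag: 'Left', left: 'cannot encode a non-finite number' }
}
```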

Also, please feel free to post questions like this on the PR itself - hopefully it's a bit more accessible, since the code & its documentation can serve as a record of progress rather than having to parse dozens of comments

@jamiehodge

jamiehodge commented Jun 2, 2022

Sorry to be bothersome, but I wonder whether the explicit-errors work shouldn't be delayed until another major release, allowing the current, well-functioning experimental API to pass into stability. It might sound childish, but I really wish I could import Codec/Decoder/etc. directly from io-ts. It would also be great to get io-ts-types onto the newer API.
