argonaut
Purely functional JSON parser and library in Scala.
Argonaut: Purely Functional JSON in Scala
I'm using Scala & Argonaut, trying to parse the following JSON:
[
{
"name": "apple",
"type": "fruit",
"size": 3
},
{
"name": "jam",
"type": "condiment",
"size": 5
},
{
"name": "beef",
"type": "meat",
"size": 1
}
]
I'm struggling to work out how to iterate over the array and extract the values into a List[MyType], where MyType has name, type and size properties.
I will post more specific code soon (I have tried many things), but basically I'm looking to understand how the cursor works, and how to iterate through arrays etc. I have tried using \\ (downArray) to move to the head of the array, then :->- to iterate through the array, but --\ (downField) is not available (at least IntelliJ doesn't think so).
So the question is, how do I:
- navigate to the array
- iterate through the array (and know when I'm done)
- extract string, integer, etc. values for each field (jdecode[String]? as[String]?)
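For reference, a minimal sketch of the decoder-based route (assuming Argonaut 6.x; MyType and the Scala field name kind are stand-ins, since type is a reserved word in Scala):

```scala
import argonaut._, Argonaut._

// Hypothetical model; `kind` stands in for the JSON field "type".
case class MyType(name: String, kind: String, size: Int)

object MyType {
  // casecodec3 wires the three JSON field names to the constructor/extractor.
  implicit def MyTypeCodec: CodecJson[MyType] =
    casecodec3(MyType.apply, MyType.unapply)("name", "type", "size")
}

val input =
  """[{"name":"apple","type":"fruit","size":3},
     {"name":"jam","type":"condiment","size":5},
     {"name":"beef","type":"meat","size":1}]"""

// decodeOption derives DecodeJson[List[MyType]] from the element codec,
// so no manual cursor iteration is needed for a homogeneous array.
val items: List[MyType] = input.decodeOption[List[MyType]].getOrElse(Nil)
```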
Source: (StackOverflow)
In Scala, algebraic data types are encoded as sealed
one-level type hierarchies. Example:
-- Haskell
data Positioning a = Append
| AppendIf (a -> Bool)
| Explicit ([a] -> [a])
// Scala
sealed trait Positioning[A]
case object Append extends Positioning[Nothing]
case class AppendIf[A](condition: A => Boolean) extends Positioning[A]
case class Explicit[A](f: Seq[A] => Seq[A]) extends Positioning[A]
With case classes and case objects, Scala generates a bunch of things like equals, hashCode, unapply (used by pattern matching) etc. that bring us many of the key properties and features of traditional ADTs.
There is one key difference though – In Scala, "data constructors" have their own types. Compare the following two for example (Copied from the respective REPLs).
// Scala
scala> :t Append
Append.type
scala> :t AppendIf[Int](Function const true)
AppendIf[Int]
-- Haskell
haskell> :t Append
Append :: Positioning a
haskell> :t AppendIf (const True)
AppendIf (const True) :: Positioning a
I have always considered the Scala variation to be on the advantageous side. After all, there is no loss of type information. AppendIf[Int], for instance, is a subtype of Positioning[Int].
scala> val subtypeProof = implicitly[AppendIf[Int] <:< Positioning[Int]]
subtypeProof: <:<[AppendIf[Int],Positioning[Int]] = <function1>
In fact, you get an additional compile time invariant about the value. (Could we call this a limited version of dependent typing?)
This can be put to good use – Once you know what data constructor was used to create a value, the corresponding type can be propagated through the rest of the flow to add more type safety. For example, Play JSON, which uses this Scala encoding, will only allow you to extract fields from JsObject, not from any arbitrary JsValue.
scala> import play.api.libs.json._
import play.api.libs.json._
scala> val obj = Json.obj("key" -> 3)
obj: play.api.libs.json.JsObject = {"key":3}
scala> obj.fields
res0: Seq[(String, play.api.libs.json.JsValue)] = ArrayBuffer((key,3))
scala> val arr = Json.arr(3, 4)
arr: play.api.libs.json.JsArray = [3,4]
scala> arr.fields
<console>:15: error: value fields is not a member of play.api.libs.json.JsArray
arr.fields
^
scala> val jsons = Set(obj, arr)
jsons: scala.collection.immutable.Set[Product with Serializable with play.api.libs.json.JsValue] = Set({"key":3}, [3,4])
In Haskell, fields would probably have type JsValue -> Set (String, JsValue), which means it will fail at runtime for a JsArray etc. This problem also manifests in the form of well-known partial record accessors.
The view that Scala's treatment of data constructors is wrong has been expressed numerous times – on Twitter, mailing lists, IRC, SO etc. Unfortunately I don't have links to any of those, except for a couple - this answer by Travis Brown, and Argonaut, a purely functional JSON library for Scala.
Argonaut consciously takes the Haskell approach (by making the case classes private and providing data constructors manually). You can see that the problem I mentioned with the Haskell encoding exists with Argonaut as well. (Except it uses Option to indicate partiality.)
scala> import argonaut._, Argonaut._
import argonaut._
import Argonaut._
scala> val obj = Json.obj("k" := 3)
obj: argonaut.Json = {"k":3}
scala> obj.obj.map(_.toList)
res6: Option[List[(argonaut.Json.JsonField, argonaut.Json)]] = Some(List((k,3)))
scala> val arr = Json.array(jNumber(3), jNumber(4))
arr: argonaut.Json = [3,4]
scala> arr.obj.map(_.toList)
res7: Option[List[(argonaut.Json.JsonField, argonaut.Json)]] = None
I have been pondering this for quite some time, but still do not understand what makes Scala's encoding wrong. Sure it hampers type inference at times, but that does not seem like a strong enough reason to decree it wrong. What am I missing?
Source: (StackOverflow)
Model:
case class DateValue(year: Option[Int] = None, month: Option[Int] = None)
Argonaut-based decoder:
implicit val dateValueDecode = casecodec2(DateValue.apply, DateValue.unapply)("year", "month")
This allows parsing the following format:
{
"year": "2013",
"month": "10"
}
Now I want to simplify the JSON format and use "2013/10" instead, but leave my model unchanged. How can I accomplish this with Argonaut?
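One possible route (a sketch, assuming Argonaut 6.x): replace the casecodec with a hand-written DecodeJson that reads the value as a single string and splits it. The splitting logic below is an assumption about the "yyyy/mm" format:

```scala
import argonaut._, Argonaut._
import scala.util.Try

case class DateValue(year: Option[Int] = None, month: Option[Int] = None)

// Reads a bare "yyyy/mm" string; anything unparsable becomes None,
// matching the optional fields of the model.
implicit val dateValueDecode: DecodeJson[DateValue] =
  DecodeJson(c => c.as[String].map { s =>
    s.split("/") match {
      case Array(y, m) => DateValue(Try(y.toInt).toOption, Try(m.toInt).toOption)
      case _           => DateValue()
    }
  })

val parsed = """["2013/10"]""".decodeOption[List[DateValue]]
```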
Source: (StackOverflow)
I'm trying to encode/decode the following case class
case class Person(name: String, age: Int, childs: List[Person])
using the following code:
object Person {
implicit def PersonCodecJson =
casecodec3(Person.apply, Person.unapply)("name", "age", "childs")
}
with Argonaut, but I'm getting the following compiler error:
could not find implicit value for evidence parameter of type argonaut.EncodeJson[List[Person]]
Obviously, the compiler doesn't know how to handle encoding of List[Person], because it's used inside the definition of how to encode Person.
Is there a clever way to tell argonaut how to encode it the right way?
Update (thanks to Travis): It's compiling now, but it's not working.
implicit def PersonCodecJson : CodecJson[Person] =
casecodec3(Person.apply, Person.unapply)("name", "age", "childs")
leads to an infinite recursion and a stack overflow trying to decode
val input = """
[{"name": "parent1", "age": 31, "childs": [{"name": "child1", "age": 2, "childs": []}]},
{"name": "parent2", "age": 29, "childs": []}
]
"""
val persons = input.decodeOption[List[Person]].getOrElse(Nil)
results in
at Person$.PersonCodecJson(main.scala:8)
at Person$.PersonCodecJson(main.scala:8)
at Person$.PersonCodecJson(main.scala:8)
at Person$.PersonCodecJson(main.scala:8)
at Person$.PersonCodecJson(main.scala:8)
at Person$.PersonCodecJson(main.scala:8)
at Person$.PersonCodecJson(main.scala:8)
at Person$.PersonCodecJson(main.scala:8)
[debug] Thread run-main-1 exited.
[debug] Interrupting remaining threads (should be all daemons).
[debug] Sandboxed run complete..
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
at sbt.BuildCommon$$anonfun$toError$1.apply(Defaults.scala:1653)
at sbt.BuildCommon$$anonfun$toError$1.apply(Defaults.scala:1653)
at scala.Option.foreach(Option.scala:236)
at sbt.BuildCommon$class.toError(Defaults.scala:1653)
at sbt.Defaults$.toError(Defaults.scala:35)
at sbt.Defaults$$anonfun$runTask$1$$anonfun$apply$36$$anonfun$apply$37.apply(Defaults.scala:656)
at sbt.Defaults$$anonfun$runTask$1$$anonfun$apply$36$$anonfun$apply$37.apply(Defaults.scala:654)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
at sbt.std.Transform$$anon$4.work(System.scala:64)
Is this approach to decoding this nested JSON even valid? Do I have to tackle it completely differently? Or is there just another small piece of code missing?
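One workaround (a sketch; the exact cause may vary by Argonaut version) is to skip casecodec3 for the recursive type and write the codec by hand, so the recursive implicit lookup happens inside the decode/encode lambdas rather than eagerly when the codec is constructed:

```scala
import argonaut._, Argonaut._

case class Person(name: String, age: Int, childs: List[Person])

object Person {
  implicit def PersonDecodeJson: DecodeJson[Person] =
    DecodeJson(c => for {
      name   <- (c --\ "name").as[String]
      age    <- (c --\ "age").as[Int]
      childs <- (c --\ "childs").as[List[Person]]  // resolved per call, not at construction
    } yield Person(name, age, childs))

  implicit def PersonEncodeJson: EncodeJson[Person] =
    EncodeJson(p =>
      ("name" := p.name) ->: ("age" := p.age) ->: ("childs" := p.childs) ->: jEmptyObject)
}

val input =
  """[{"name": "parent1", "age": 31, "childs": [{"name": "child1", "age": 2, "childs": []}]},
     {"name": "parent2", "age": 29, "childs": []}]"""

val persons = input.decodeOption[List[Person]].getOrElse(Nil)
```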
Source: (StackOverflow)
In the Argonaut DecodeJson trait there is a method ||| for chaining together decoders, so that the first succeeding decoder is chosen. There is also a similar method in DecodeResult which has the same effect. It looks at first glance as though one of these would be what we want for decoding multiple subtypes of a common trait. However, how do we actually do this?
The first problem is that the argument to ||| has to be a DecodeJson decoding a supertype of the type that the callee is supposed to be decoding (and similarly for DecodeResult). I would expect that such a decoder would be able to decode all of the subtypes of the common supertype, so this seems like a recipe for infinite recursion!
We can get around this using the following ugly asInstanceOf hack while defining the CodecJson for the supertype:
c => c.as[A] ||| c.as[Foo](implicitly[DecodeJson[B]].asInstanceOf[DecodeJson[Foo]])
However, then there is still a problem when decoding more than two subtypes. Assume there are subtypes A, B and C of Foo. Now what? How do we add yet another alternative to this decoding expression? .asInstanceOf[DecodeJson[AnyRef]] is going to destroy the type-safety of the parsed result (as if we hadn't already discarded type-safety at this point!). And then we are quickly going to run out of options with 4, 5, or 6 alternatives.
EDIT: I will gladly accept as an answer any alternative approach to decoding more-than-2-wide subtype hierarchies using Argonaut.
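For the record, one asInstanceOf-free alternative (a sketch with made-up subtypes) is to upcast each subtype decoder with map before combining; the ||| chain then extends to any number of alternatives:

```scala
import argonaut._, Argonaut._

sealed trait Foo
case class A(a: Int)    extends Foo
case class B(b: String) extends Foo

object Foo {
  val aDecode: DecodeJson[A] = DecodeJson(c => (c --\ "a").as[Int].map(A(_)))
  val bDecode: DecodeJson[B] = DecodeJson(c => (c --\ "b").as[String].map(B(_)))

  // Widen each decoder to DecodeJson[Foo] first; adding a third or
  // fourth subtype is just another ||| clause.
  implicit val fooDecode: DecodeJson[Foo] =
    aDecode.map(a => (a: Foo)) ||| bDecode.map(b => (b: Foo))
}

val decodedB = """{"b":"x"}""".decodeOption[Foo]
val decodedA = """{"a":1}""".decodeOption[Foo]
```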
Source: (StackOverflow)
I'm having a hard time slogging through the Argonaut documentation, so I figured I'd just ask for a simple example.
val input = """{"a":[{"b":4},{"b":5}]}"""
val output = ??? // desired value: List(4, 5)
I can get a cursor down to the array:
Parse.parse(input).map((jObjectPL >=> jsonObjectPL("a") >=> jArrayPL)(_))
// scalaz.\/[String,Option[scalaz.IndexedStore[argonaut.Argonaut.JsonArray,
// argonaut.Argonaut.JsonArray,argonaut.Json]]] =
// \/-(Some(IndexedStoreT((<function1>,List({"b":4}, {"b":5})))))
But then what? Am I on the right track? Should I even be using cursors for this?
Edit - Here's some progress, I guess. I've written a decoder for the list:
Parse.parse("""[{"b": 4}, {"b": 5}]""")
.map(_.as(IListDecodeJson(DecodeJson(_.downField("b").as[Int]))))
// scalaz.\/[String,argonaut.DecodeResult[scalaz.IList[Int]]] =
// \/-(DecodeResult(\/-([4,5])))
Edit - Slowly starting to put it together...
Parse.parse(input).map(_.as[HCursor].flatMap(_.downField("a").as(
IListDecodeJson(DecodeJson(_.downField("b").as[Int])))))
// scalaz.\/[String,argonaut.DecodeResult[scalaz.IList[Int]]] =
// \/-(DecodeResult(\/-([4,5])))
Edit - So I guess my best solution so far is:
Parse.parse(input).map(_.as(
DecodeJson(_.downField("a").as(
IListDecodeJson(DecodeJson(_.downField("b").as[Int])).map(_.toList)
))
))
Feels a bit verbose, though.
Source: (StackOverflow)
I'm new to Scala, and I'm trying to create a generic JSON converter based on Argonaut. I've tried searching on Google and Stack Overflow, but so far I have no clue.
Here is the snippet of my code.
import java.io.{InputStream, OutputStream}
import java.nio.charset.Charset
import org.springframework.http.converter.AbstractHttpMessageConverter
import org.springframework.http.{MediaType, HttpInputMessage, HttpOutputMessage}
import scala.io.Source
import argonaut._, Argonaut._
case class Currency(code: String)
object Currency {
implicit def CurrencyCodecJson: CodecJson[Currency] = casecodec1(Currency.apply, Currency.unapply)("code")
}
case class Person(firstName: String, lastName: String)
object Person {
implicit def PersonCodecJson: CodecJson[Person] = casecodec2(Person.apply, Person.unapply)("firstName", "lastName")
}
class ArgonautConverter extends AbstractHttpMessageConverter[Object](new MediaType("application", "json", Charset.forName("UTF-8")), new MediaType("application", "*+json", Charset.forName("UTF-8"))) {
  val c = classOf[Currency]
  val p = classOf[Person]

  def writeInternal(t: Object, outputStream: OutputStream) = {
    val jsonString = t match {
      case c: Currency => c.asJson.nospaces
      case p: Person => p.asJson.nospaces
    }
    outputStream.write(jsonString.getBytes("UTF-8"))
  }

  def supports(clazz: Class[_]): Boolean = clazz == classOf[Currency] || clazz == classOf[Person]

  def readInternal(clazz: Class[_ <: Object], inputStream: InputStream): Object = {
    val jsonString = Source.fromInputStream(inputStream).getLines.mkString
    val jsonObject = clazz match {
      case `c` => jsonString.decodeOption[Currency]
      case `p` => jsonString.decodeOption[Person]
    }
    jsonObject match {
      case Some(j) => j
      case None => null
    }
  }
}
What I'm trying to do is to generalize such that I don't need to keep adding the match for every new model class (like Currency and Person in this case) that I will add in the future.
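One direction this could take (a sketch; Decoders, register and the ClassTag plumbing are all invented for illustration) is to move the per-type knowledge into a registry keyed by runtime class, so the converter body never grows as models are added:

```scala
import argonaut._, Argonaut._
import scala.reflect.ClassTag

case class Currency(code: String)
object Currency {
  implicit val codec: CodecJson[Currency] =
    casecodec1(Currency.apply, Currency.unapply)("code")
}

// Hypothetical registry: each model registers its decoder once; a
// converter's readInternal can then dispatch on clazz generically.
object Decoders {
  private var table = Map.empty[Class[_], DecodeJson[_ <: AnyRef]]

  def register[T <: AnyRef](implicit ct: ClassTag[T], d: DecodeJson[T]): Unit =
    table += (ct.runtimeClass -> d)

  def decode(clazz: Class[_], json: String): Option[AnyRef] =
    table.get(clazz).flatMap(d =>
      json.decodeOption(d.asInstanceOf[DecodeJson[AnyRef]]))
}

Decoders.register[Currency]
val cur = Decoders.decode(classOf[Currency], """{"code":"USD"}""")
```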
Source: (StackOverflow)
I am trying to change the implementation of this function from using Play's JSON library, like so:
def apply[T](action: => ApiResponse[T])(implicit tjs: Writes[T], ec: ExecutionContext): Future[Result] = {
action.fold(
err =>
Status(err.statusCode) {
JsObject(Seq(
"status" -> JsString("error"),
"statusCode" -> JsNumber(err.statusCode),
"errors" -> Json.toJson(err.errors)
))
},
t =>
Ok {
JsObject(Seq(
"status" -> JsString("ok"),
"response" -> Json.toJson(t)
))
}
)
}
to use Argonaut, like so:
def apply[T](action: => ApiResponse[T])(implicit encodeJson: EncodeJson[T], ec: ExecutionContext): Future[Result] = {
action.fold(
err =>
Status(err.statusCode) {
Json(
"status" -> jString("error"),
"statusCode" -> jNumber(err.statusCode),
"errors" -> err.errors.asJson
)
},
t =>
Ok {
Json(
"status" -> jString("ok"),
"response" -> t.asJson
)
}
)
}
but I get
Cannot write an instance of argonaut.Json to HTTP response. Try to
define a Writeable[argonaut.Json]
for both the Status {} block and the Ok {} block. I got a helpful answer to this problem here: https://groups.google.com/forum/#!topic/play-framework/vBMf72a10Zc
So I tried to create the implicit conversion like so:
implicit def writeableOfArgonautJson(implicit codec: Codec): Writeable[Json] = {
Writeable(jsval => codec.encode(jsval.toString))
}
which I think converts the JSON object to a string and provides it to codec.encode, which should convert it to Array[Byte], but I get
Cannot guess the content type to use for argonaut.Json. Try to define
a ContentTypeOf[argonaut.Json]
jsval.nospaces.getBytes also returns Array[Byte], so I don't know if that can be used to help at all.
So while I think that last error message means I just need to tell Play to use content type application/json, I also feel like this might be an unnecessary rabbit hole and there should be an easier way to do this.
Edit: it wasn't such a rabbit hole after all; defining the ContentTypeOf has things compiling at least, but I would still like to know whether this is correct.
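For the record, the pair of implicits the error messages are asking for might look like this (a sketch against the Play 2.x API; the names are mine, and it assumes Writeable's factory takes an implicit ContentTypeOf):

```scala
import argonaut._
import play.api.http.{ContentTypeOf, ContentTypes, Writeable}
import play.api.mvc.Codec

// Which Content-Type header to attach to an argonaut.Json body.
implicit def contentTypeOfArgonautJson(implicit codec: Codec): ContentTypeOf[Json] =
  ContentTypeOf[Json](Some(ContentTypes.JSON))

// How to turn argonaut.Json into bytes: render compactly, then encode.
implicit def writeableOfArgonautJson(implicit codec: Codec): Writeable[Json] =
  Writeable(json => codec.encode(json.nospaces))
```

With both in scope, the Status(...) {} and Ok {} blocks should accept argonaut.Json directly.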
Source: (StackOverflow)
I'm using Argonaut to parse JSON strings. There is a requirement: If any field is not provided, or is empty or blank, give it the string "not supplied" instead.
I have a solution, but it seems very complicated:
case class User(name: String, home: String)
implicit def UserDecodeJson: DecodeJson[User] = DecodeJson(j => for {
name <- (j --\ "name").as[BlankAsNonSupplied]
home <- (j --\ "home").as[BlankAsNonSupplied]
} yield User(name.value, home.value))
case class BlankAsNonSupplied(value: String)
implicit def BlankAsNonSuppliedDecodeJson: DecodeJson[BlankAsNonSupplied] = DecodeJson.withReattempt(a => {
val v = a.success.map(_.focus).flatMap(_.string)
.filterNot(_.trim.isEmpty)
.map(BlankAsNonSupplied.apply).getOrElse(BlankAsNonSupplied("not supplied"))
DecodeResult.ok(v)
})
You can see the BlankAsNonSuppliedDecodeJson one is very complicated and hard to understand. Is there any way to make it (or the whole example) simpler?
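For what it's worth, one way to shrink it (a sketch; orNotSupplied is a made-up helper) is to fold each field's DecodeResult directly, which removes both the wrapper class and withReattempt:

```scala
import argonaut._, Argonaut._

case class User(name: String, home: String)

// Collapses a failed lookup (missing field) or a blank string into
// the default; everything else passes through unchanged.
def orNotSupplied(r: DecodeResult[String]): String =
  r.fold(
    (_, _) => "not supplied",
    s => if (s.trim.isEmpty) "not supplied" else s)

implicit val userDecode: DecodeJson[User] =
  DecodeJson(c => DecodeResult.ok(User(
    orNotSupplied((c --\ "name").as[String]),
    orNotSupplied((c --\ "home").as[String]))))

val u = """{"name":"  "}""".decodeOption[User]
```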
Source: (StackOverflow)
I have a case class and companion object:
case class Person private(name: String, age: Int)
object Person {
def validAge(age: Int) = {
if (age > 18) age.successNel else "Age is under 18".failureNel
}
def validName(name: String) = {
name.successNel
}
def create(name: String, age: Int) = (validName(name) |@| validAge(age))(Person.apply)
}
I want to use Argonaut to parse some JSON and return a Person OR some errors, as a list. So I need to:
- Read the JSON from a string, and validate the string is correctly formed
- Decode the JSON into a Person, or List of error strings.
I want to return errors in the form of something I can turn into some more JSON like:
{
errors: ["Error1", "Error2"]
}
I first tried using Argonaut's decodeValidation method, which returns a Validation[String, X]. Unfortunately, I need a List of errors.
Any suggestions?
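One possible shape (a sketch; RawPerson and the "Malformed JSON" message are invented): decode the raw fields with an ordinary decoder first, then run the validating factory, merging both failure modes into a ValidationNel whose list can be rendered as the errors array:

```scala
import argonaut._, Argonaut._
import scalaz._, Scalaz._

case class Person private (name: String, age: Int)

object Person {
  def validAge(age: Int): ValidationNel[String, Int] =
    if (age > 18) age.successNel else "Age is under 18".failureNel
  def validName(name: String): ValidationNel[String, String] =
    name.successNel
  def create(name: String, age: Int): ValidationNel[String, Person] =
    (validName(name) |@| validAge(age))((n, a) => Person(n, a))
}

// Plain decoder for the unvalidated shape.
case class RawPerson(name: String, age: Int)
implicit val rawDecode: DecodeJson[RawPerson] =
  DecodeJson(c => for {
    n <- (c --\ "name").as[String]
    a <- (c --\ "age").as[Int]
  } yield RawPerson(n, a))

def parsePerson(json: String): ValidationNel[String, Person] =
  json.decodeOption[RawPerson]
    .map(r => Person.create(r.name, r.age))
    .getOrElse("Malformed JSON".failureNel[Person])

val errors: List[String] =
  parsePerson("""{"name":"x","age":10}""").fold(_.list, _ => Nil)
```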
Source: (StackOverflow)
Parsing a JSON string like """["test",["aaa", "bbb", "ccc"]]""" is easy using scala.util.parsing.json:
// def jsonResponse = scala.io.Source.fromURL("http://en.wikipedia.org/w/api.php?format=json&action=opensearch&search=test").mkString
def jsonResponse = """["test",["aaa", "bbb", "ccc"]]"""
def responseStrings = scala.util.Try[List[String]] {
val Some(List("test", words: List[_])) = scala.util.parsing.json.JSON.parseFull(jsonResponse)
words.map{case w: String => w}
}
responseStrings.get foreach println
prints
aaa
bbb
ccc
How can I do this in such an easy way using Argonaut?
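Assuming Argonaut 6.x, its tuple decoders (which read fixed-length JSON arrays) seem to make this nearly a one-liner; the (String, List[String]) shape mirrors the response:

```scala
import argonaut._, Argonaut._

def jsonResponse = """["test",["aaa", "bbb", "ccc"]]"""

// A two-element JSON array decodes as a Tuple2; the second slot is
// the word list we want.
val words: List[String] =
  jsonResponse.decodeOption[(String, List[String])]
    .map(_._2)
    .getOrElse(Nil)

words foreach println
```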
Source: (StackOverflow)
I have two case classes, let's call them case class User and case class Ticket. Both of these case classes implement the operations required to be members of the same type class, in this case Argonaut's EncodeJson. Is it possible to view these two separate types as the same without creating an empty marker type that they both extend?
trait Marker
case class User extends Marker
case class Ticket extends Marker
To make this concrete, we have two separate functions that return these case classes:
case class GetUser(userId: Long) extends Service[Doesn't Matter, User] {
def apply(req: Doesn't Matter): Future[User] = {
magical and awesome business logic
return Future[User]
}
}
case class GetTicket(ticketId: Long) extends Service[Doesn't Matter, Ticket] {
def apply(req: Doesn't Matter): Future[Ticket] = {
magical and awesome business logic
return Future[Ticket]
}
}
I would like to compose these two Services so that they return the same type, in this case argonaut.Json, but the compiler's response to an implicit conversion is "LOLNO":
implicit def anyToJson[A](a: A)(implicit e: EncodeJson[A]): Json = e(a)
Any ideas? Thanks!
Source: (StackOverflow)
In Argonaut, how does one easily rename the corresponding JSON property name in instances where a case class contains an Either?
For example, given this definition:
case class Foo(f: String)
case class Bar(b: String)
case class FooBar(e: Either[Foo, Bar])
implicit def FooCodecJson: CodecJson[Foo] = casecodec1(Foo.apply, Foo.unapply)("f")
implicit def BarCodecJson: CodecJson[Bar] = casecodec1(Bar.apply, Bar.unapply)("b")
implicit def FooBarCodecJson: CodecJson[FooBar] = casecodec1(FooBar.apply, FooBar.unapply)("e")
converting a FooBar to JSON like FooBar(Right(Bar("hello"))).asJson.spaces4 results in the following:
{
"e" : {
"Right" : {
"b" : "hello"
}
}
}
What is the easiest way to rename the "Right" to something more meaningful in the output above? (My actual scenario has many case classes with many Eithers, so I am looking for the most concise way possible.)
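One concise option (a sketch; eitherCodec and the key names "foo"/"bar" are invented) is a small generic helper that builds a CodecJson[Either[A, B]] with caller-chosen keys, declared once and reused for every Either field:

```scala
import argonaut._, Argonaut._

case class Foo(f: String)
case class Bar(b: String)

implicit val fooCodec: CodecJson[Foo] = casecodec1(Foo.apply, Foo.unapply)("f")
implicit val barCodec: CodecJson[Bar] = casecodec1(Bar.apply, Bar.unapply)("b")

// Wraps Left/Right under the given key names instead of "Left"/"Right".
def eitherCodec[A, B](leftKey: String, rightKey: String)(
    implicit ea: EncodeJson[A], da: DecodeJson[A],
             eb: EncodeJson[B], db: DecodeJson[B]): CodecJson[Either[A, B]] =
  CodecJson(
    {
      case Left(a)  => Json.obj(leftKey  := a)
      case Right(b) => Json.obj(rightKey := b)
    },
    c => (c --\ leftKey).as[A].map(a => (Left(a): Either[A, B])) |||
         (c --\ rightKey).as[B].map(b => (Right(b): Either[A, B])))

val fooOrBar: CodecJson[Either[Foo, Bar]] = eitherCodec("foo", "bar")
val rendered = fooOrBar.encode(Right(Bar("hello"))).nospaces
```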
Source: (StackOverflow)
I'm writing a library to convert JSON responses from an API for backwards-compatibility reasons. What I need to do is take in arbitrary JSON and change certain field names. I'm using Scala and Argonaut, but I don't see any way in the docs or examples of changing the FIELD names, only the values.
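To illustrate the kind of thing that appears to be possible (renameFields is invented; withObject and withArray are Argonaut 6.x's Json tree-rewriting hooks):

```scala
import argonaut._, Argonaut._

// Recursively walks the tree, applying a key-rename map to every
// object field at every depth; non-object, non-array nodes pass through.
def renameFields(json: Json, renames: Map[String, String]): Json =
  json
    .withObject(obj => obj.toList.foldLeft(JsonObject.empty) {
      case (acc, (k, v)) => acc + (renames.getOrElse(k, k), renameFields(v, renames))
    })
    .withArray(_.map(renameFields(_, renames)))

val in  = Parse.parseOption("""{"old":1,"nested":{"old":2},"list":[{"old":3}]}""").get
val out = renameFields(in, Map("old" -> "new"))
```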
Source: (StackOverflow)
I have a Scala project that uses a bunch of Java code, for example this Java source:
public enum Category { FOO, BAR };
I then have a bunch of Scala case classes that I serialise to and from JSON using Argonaut like this:
case class Thing (a: String, b: Int, c: Float)
object Thing {
implicit val j = casecodec3 (Thing.apply, Thing.unapply)("a", "b", "c")
implicit val e: Equal[Thing] = Equal.equal (_ == _)
}
Fine, now I want to write a Scala case class that uses a Java enum like so:
case class Thing (a: String, b: Int, c: Float, d: Category)
object Thing {
implicit val j = casecodec4 (Thing.apply, Thing.unapply)("a", "b", "c", "d")
implicit val e: Equal[Thing] = Equal.equal (_ == _)
}
This will yield a compilation error because there is no implicit codec for the Category enum.
I guess I could write my own codec specifically for dealing with the Category enum by doing something like this:
package object ArgonautImplicits {
implicit val dx: DecodeJson[Category] = StringDecodeJson.map(x => Category.valueOf(x))
implicit val ex: EncodeJson[Category] = EncodeJson(x => jString(x.toString))
}
But I want to know if there is a way to write a single codec that will automatically handle any Java enum.
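Something along these lines seems plausible (a sketch, not tested against every Argonaut version): one generic codec pair over java.lang.Enum, keyed by a ClassTag, round-tripping through the constant's name:

```scala
import argonaut._, Argonaut._
import scala.reflect.ClassTag

// Encode any Java enum as its constant name.
implicit def javaEnumEncode[E <: Enum[E]]: EncodeJson[E] =
  EncodeJson(e => jString(e.name))

// Decode via Enum.valueOf, reporting unknown constants as decode failures.
implicit def javaEnumDecode[E <: Enum[E]](implicit ct: ClassTag[E]): DecodeJson[E] =
  DecodeJson(c => c.as[String].flatMap { s =>
    try DecodeResult.ok(Enum.valueOf(ct.runtimeClass.asInstanceOf[Class[E]], s))
    catch {
      case _: IllegalArgumentException =>
        DecodeResult.fail(s"Unknown enum constant: $s", c.history)
    }
  })

// Works for any Java enum, e.g. java.util.concurrent.TimeUnit.
import java.util.concurrent.TimeUnit
val units = """["SECONDS","DAYS"]""".decodeOption[List[TimeUnit]]
```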
Source: (StackOverflow)