nscala-time

A new Scala wrapper for Joda Time based on scala-time

How do I add a new column with the day of week based on another column in a dataframe?

I have a field in a data frame currently formatted as a string (mm/dd/yyyy), and I want to create a new column in that data frame with the day-of-week name (e.g. Thursday) for that field. I've imported

import com.github.nscala_time.time.Imports._

but am not sure where to go from here.


Source: (StackOverflow)
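
A minimal sketch of one way to finish this, assuming a Spark DataFrame df with a string column date_str (both names are hypothetical). It parses each value inside a UDF with Joda-Time, which nscala-time wraps:

import org.apache.spark.sql.functions.{col, udf}
import org.joda.time.format.DateTimeFormat

// Build the mm/dd/yyyy parser inside the function so the closure stays
// serializable; getAsText on the dayOfWeek property yields the full day
// name, e.g. "Thursday".
val dayOfWeekName = udf { (s: String) =>
  DateTimeFormat.forPattern("MM/dd/yyyy").parseDateTime(s).dayOfWeek.getAsText
}

val withDay = df.withColumn("day_of_week", dayOfWeekName(col("date_str")))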

Iterate over a date range (the Scala way)

Given a start and an end date, I would like to iterate over the range day by day using a foreach, map, or similar function. Something like

(DateTime.now to DateTime.now + 5.day by 1.day).foreach(println)

I am using https://github.com/nscala-time/nscala-time, but the syntax above returns a Joda Interval object, which I suspect is not a range of dates anyway, but rather a span of milliseconds.


Source: (StackOverflow)
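
A minimal sketch of the iteration with nscala-time (start and end are just illustrative names). Iterator.iterate keeps it lazy, and isAfter is plain Joda-Time, so no Interval is involved:

import com.github.nscala_time.time.Imports._

val start = DateTime.now
val end = start + 5.days

// Advance one day at a time, keeping dates up to and including end.
Iterator.iterate(start)(_ + 1.day).takeWhile(d => !d.isAfter(end)).foreach(println)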

Scala datetime: nscala-time vs Java 8 date and time

Before Java 8 was released, I would definitely have chosen nscala-time for my Scala Play project.

Since Java 8's release, however, the recommendation for Java projects has been to use the Java 8 date and time API instead of Joda-Time.

So what about Scala projects? Should we stick with nscala-time or switch? And if we use the Java 8 date and time API, do we have to work with mutable objects?

Also, which library has good support for the Play JSON library?


Source: (StackOverflow)
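
On the last two points: the java.time classes are immutable, just like Joda-Time's, and recent Play JSON versions (2.6+) ship default Reads and Writes for the java.time types, while Joda support moved to a separate play-json-joda module. A minimal sketch, assuming Play JSON 2.6+ on the classpath (Event is a hypothetical example class):

import java.time.LocalDateTime
import play.api.libs.json._

case class Event(name: String, at: LocalDateTime)

object Event {
  // Relies on Play JSON's built-in java.time Reads/Writes.
  implicit val format: OFormat[Event] = Json.format[Event]
}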

Convert nscala-time DateTime to java.util.Date

I am trying to convert a com.github.nscala_time.time.Imports.DateTime from nscala-time (a wrapper around Joda-Time) to java.util.Date:

activeUntil.toDate()

But I get this error

value toDate is not a member of Option[com.github.nscala_time.time.Imports.DateTime]

Obviously this is not the right way to do it. Is there a way to do this?

Thank you in advance


Source: (StackOverflow)
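
The error says activeUntil is an Option[DateTime], not a DateTime, so the conversion has to go through the Option. A minimal sketch (the empty-case handling is an assumption about what the surrounding code wants):

// Map over the Option; toDate is Joda-Time's conversion to java.util.Date.
val asDate: Option[java.util.Date] = activeUntil.map(_.toDate)

// Or unwrap, failing loudly if an empty value would be a bug here:
val date: java.util.Date =
  activeUntil.map(_.toDate).getOrElse(sys.error("activeUntil is not set"))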

How can I use nscala_time inside spark-shell?

I'm trying to test some code out in the spark-shell and I need to set some time fields. We're using nscala_time for DateTime functionality. When I run

$ scala -cp `ls -1 | tr "\\n" ":"`

from the directory with my staged jars, everything works fine and I can run

scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._

scala> val current = DateTime.now
current: org.joda.time.DateTime = 2015-04-23T10:44:35.984-07:00

however when I try the same thing with spark-shell 1.3.0

$ spark-shell -cp `ls -1 | tr "\\n" ":"`

I end up with errors when I do the same thing as in the Scala console:

scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._

scala> val current = DateTime.now
java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at com.github.nscala_time.time.LowPriorityOrderingImplicits$class.ReadableInstantOrdering(Implicits.scala:64)
    at com.github.nscala_time.time.Imports$.ReadableInstantOrdering(Imports.scala:20)
    at com.github.nscala_time.time.OrderingImplicits$class.$init$(Implicits.scala:56)
    at com.github.nscala_time.time.Imports$.<init>(Imports.scala:20)
    at com.github.nscala_time.time.Imports$.<clinit>(Imports.scala)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
    at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
    at $iwC$$iwC$$iwC.<init>(<console>:39)
    at $iwC$$iwC.<init>(<console>:41)
    at $iwC.<init>(<console>:43)
    at <init>(<console>:45)
    at .<init>(<console>:49)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Anyone have any ideas why I can't use nscala_time in spark-shell?


Source: (StackOverflow)
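
The NoSuchMethodError on scala.Predef$.$conforms points to a Scala binary-version mismatch rather than a missing jar: $conforms was introduced in Scala 2.11, and Spark 1.3.0 ships with Scala 2.10 by default, so an nscala_time jar built for 2.11 loads but fails when its Imports object initializes. A sketch of a launch that should work, assuming the staged directory holds _2.10 builds of nscala-time and its dependencies (spark-shell takes a comma-separated --jars list rather than a -cp style colon-separated classpath):

$ spark-shell --jars "$(ls -1 *.jar | paste -sd, -)"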

How does DateTime.now.hour(0) produce a new DateTime in nscala-time?

I use the nscala-time package from https://github.com/nscala-time/nscala-time.

scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._

scala> DateTime.now.hour(0)
res0: org.joda.time.DateTime = 2015-06-11T22:33:52.266+08:00

This gets a new DateTime object. But the source, https://github.com/nscala-time/nscala-time/blob/master/src/main/scala/com/github/nscala_time/time/RichDateTime.scala, does not have an hour function with an (Int) signature. It seems that withHour(hour: Int) does the exact same work.


Source: (StackOverflow)
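
Whatever hour(0) resolves to in the installed version, the underlying behaviour is Joda-Time's: DateTime is immutable, so every "setter" returns a fresh instance rather than mutating the receiver, and withHour corresponds to Joda's withHourOfDay. A minimal sketch in plain Joda-Time:

import org.joda.time.DateTime

val now = DateTime.now
val atHourZero = now.withHourOfDay(0) // a new instance; now is untouched

println(now)        // e.g. 2015-06-11T22:33:52.266+08:00
println(atHourZero) // e.g. 2015-06-11T00:33:52.266+08:00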

extends Super in com.github.nscala_time.time.DurationBuilder

I recently downloaded the source code of the com.github.nscala_time package, version 2.11. After setting up the dependency in Maven, I got lots of errors. I checked one file, com.github.nscala_time.time.DurationBuilder, and it contains a line like:

class DurationBuilder(val underlying: Period) extends Super {..

There is no class or type named "Super" in the same package or in any imported package. I am wondering: does Scala have a type called "Super"? The Eclipse Scala 2.11 compiler complains that it cannot find the type "Super".


Source: (StackOverflow)
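
A plausible explanation is that parts of nscala-time's sources are generated by its own sbt build, so compiling the raw source tree in Maven or Eclipse never produces the Super definition. Purely as an illustration (the alias's exact name and location here are assumptions, not verified against nscala-time's build), a generated definition of this shape would make the line compile by turning DurationBuilder into a value class:

package com.github.nscala_time

package object time {
  // Hypothetical build-generated alias: AnyVal on Scala 2.10+, where
  // value classes exist; it would be Any on older Scala versions.
  private[time] type Super = AnyVal
}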