Questions tagged [scala]

Scala is a general-purpose programming language principally targeting the Java Virtual Machine. Designed to express common programming patterns in a concise, elegant, and type-safe way, it fuses both imperative and functional programming styles. Its key features are: an advanced static type system with type inference; function types; pattern-matching; implicit parameters and conversions; operator overloading; full interoperability with Java; concurrency

0
votes
0 answers
8 views

Stuck on writing a polymorphic transpose function that accepts and returns RDDs of either Arrays or Seqs/Vectors

I am refactoring a Scala library that interfaces with Spark to use Vectors where it makes sense. I would like to provide functions that interface directly with Spark the ability to work with either ...
0
votes
1 answer
15 views

Better way to write cascading if statements in Scala?

In JavaScript we can rewrite: if (ua.isEmpty()) { return false; } else if (ua.contains('curl')) { return false; } into this for clearer code: switch(true) { case ua.isEmpty(): ...
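The idiomatic Scala counterpart to this cascade is a match expression with guards. A minimal sketch, assuming a string-valued ua and a hypothetical allowed function (neither name is from the question):

```scala
// Sketch: an if/else-if cascade expressed as a match with guards.
// `ua` stands in for the question's user-agent value.
def allowed(ua: String): Boolean = ua match {
  case u if u.isEmpty          => false // empty user agent
  case u if u.contains("curl") => false // command-line client
  case _                       => true  // everything else passes
}
```

Unlike JavaScript's switch(true) trick, match is an expression, so the result can be returned directly.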
0
votes
0 answers
7 views

Scala - XML parsing tag not working properly

I'm trying to parse an XML file to convert it into a CSV so it can be further processed by another Scala program. In this XML file, I have tags like this : <Event EventTime="2018-12-25T22:...
1
vote
0 answers
10 views

Why does pureconfig not find my implicit readers?

I use a double to encode a boolean value in a configuration file. PureConfig does not find a way to cast it while reading the configuration. Here is some code to reproduce the behavior. import com....
0
votes
0 answers
10 views

Scala REPL prints nothing

I've installed Scala on Ubuntu 18.04.2 LTS: :~$ scala -version Scala code runner version 2.11.12 -- Copyright 2002-2017, LAMP/EPFL Trying to print something in the REPL terminal shows nothing, ...
0
votes
0 answers
13 views

How do I stream data to Neo4j using Spark

I am trying to write streaming data to Neo4j using Spark and am having some problems (I am very new to Spark). I have tried setting up a stream of word counts and can write this to Postgres using a ...
-1
votes
0 answers
16 views

Websocket client library that can be used in JVM Scala & Scala.js [on hold]

Similar to my previous question, I need a websocket client implementation that compiles to JVM Scala & Scala.js for a shared library. As far as I can see, most implementations can only be used ...
2
votes
0 answers
10 views

Protect system when using scala interpreter api

I have created a simple REPL bot for Scala. It runs in a Linux environment, evaluates Scala code written in dialogs, and returns the result. For example user| 1+1 bot | res0: Int = 2 user| res0 + 3 bot ...
0
votes
1 answer
10 views

Play app with dependencies on 3 other play apps

I am running a Play app that has dependencies on 3 other Play apps, i.e. the first Play app has 3 dependencies in build.sbt. Of course, all 4 of these apps have their own route.conf file. The ...
0
votes
0 answers
16 views

spark sql query in playframework shows empty result

When I run a GroupBy query in spark-shell it shows me perfect results, but when I run the same query using Spark SQL in Play Framework with the same Spark version it shows me an empty result. I have a ...
-1
votes
0 answers
26 views

Combining results of multiple data frames: merging file in Scala

I have a requirement to merge files written by 3 different data frames in HDFS into a single file, finally putting it on the local filesystem. This task is child's play with normal Hadoop commands ...
0
votes
0 answers
9 views

Flink : org.apache.flink.api.common.InvalidProgramException Object KafkaSink$$anon$1@58f174d9 is not serializable

I am writing a key serialization scheme for a Flink Kafka producer. Here is my source code. @SerialVersionUID(100L) class KafkaSink private (dataStream: DataStream[Map[String, Any]], source: String) ...
0
votes
1 answer
26 views

spark scala transform a dataframe/rdd

I have a CSV file like below. PK,key,Value 100,col1,val11 100,col2,val12 100,idx,1 100,icol1,ival11 100,icol3,ival13 100,idx,2 100,icol1,ival21 100,icol2,ival22 101,col1,val21 101,col2,val22 101,idx,...
0
votes
1 answer
16 views

Scala Play: How to render Form mappings with Repeated and Nested values?

I'm working on a Scala Play 2.7.x app (you may check out the project here: play-silhouette-seed, googleauth branch) and I have a form defined as: object TotpSetupForm { val form = Form( mapping( ...
0
votes
1 answer
46 views

Better functional in/out arguments style in Scala

This code works fine so far, but I don't like the tuples everywhere, which leads to using ._1 and ._2 etc., which is less expressive. I can implement wrapper classes that have more ...
0
votes
0 answers
10 views

download gz file on clicking a url and convert to csv using scala

I am really struggling with the syntax here and need help. I have a URL; clicking it downloads a sample.csv.gz file. Please can someone help me fill the syntactic gaps below: val outputFile ="...
0
votes
1 answer
11 views

Is there a way to convert Scala-Spark DataFrame to HTML table, or converting DataFrame to Scala map then convert to Json and then HTML?

I run some tests and get a result which is a small DataFrame, with approx. 3-6 columns and 10-20 rows. Now I want to send this to my colleague by email, and for readability I want it in tabular format as ...
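As a sketch of just the HTML step, assuming the DataFrame has already been collected into plain Scala collections (toHtmlTable is a hypothetical helper; with Spark you might obtain the header from df.columns and the rows from df.collect()):

```scala
// Sketch: render a header row and data rows as a simple HTML table string.
def toHtmlTable(header: Seq[String], rows: Seq[Seq[String]]): String = {
  val head = header.map(h => s"<th>$h</th>").mkString("<tr>", "", "</tr>")
  val body = rows.map(r => r.map(c => s"<td>$c</td>").mkString("<tr>", "", "</tr>")).mkString
  s"<table>$head$body</table>"
}
```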
0
votes
0 answers
28 views

ClassNotFoundException when trying to start application with Docker

I have a Scala SBT project with two modules "game" and "database". I want to run those two modules in Docker, but I always get a ClassNotFoundException when I start it with "docker-compose up". I ...
0
votes
0 answers
18 views

“illegal start of simple expression” in XML file in Scala

I am writing logic to fetch data from another XML file in Scala. I have tried the logic in a Scala worksheet and it works fine, but when I write the same logic ...
1
vote
0 answers
16 views

Spark Structured Streaming unable to write parquet data to HDFS

I'm trying to write data to HDFS from Spark structured streaming code in Scala, but I'm unable to do so due to an error that I fail to understand. In my use case, I'm reading data from a Kafka ...
0
votes
0 answers
14 views

statement commit working intermittently with Apache Phoenix JDBC

I am attempting to read in rows from an Apache Phoenix table that contains file names and a column each for the time I start and finish processing the file. I am seeing inconsistent behavior when ...
-5
votes
0 answers
31 views

Checkpointing RDBMS table in Spark (Scala) [on hold]

I have a Spark Scala SBT project where I am writing data to different RDBMS tables. How can I validate that the data was written to the tables? Is there any checkpointing table that we can use?
2
votes
1 answer
32 views

Timer that can be used in JVM Scala & Scala.js

At the moment I am working on a project that is cross-compiled to Scala.js and normal JVM Scala. Now I need to implement a timer (for reconnecting a websocket) that triggers a function every x seconds....
2
votes
0 answers
40 views

lazy initialization for Scala?

I have a ScalaTest suite with the following structure, which is encapsulated within a class: var dataHolder: Holder // some lazy initialization here? def runTest(filePath: String): Unit = { //Test 1 ...
1
vote
1 answer
30 views

Why AWS is rejecting my connections when I am using wholeTextFiles() with pyspark?

I use sc.wholeTextFiles(",".join(fs), minPartitions=200) to download 6k XML files from S3 (each file 50 MB) on a single Dataproc node with 96 CPUs. When I have minPartitions=200, AWS is rejecting my ...
1
vote
1 answer
20 views

AutoRefineV not picking up explicit inference from Map?

I have a refined type definition like this: type D = String Refined Regex "(a|b)" I can use the refinement in a single line expressing the value but for some reason autoRefineV is not picking it up ...
0
votes
0 answers
18 views

Converting synthetic function “<<” in Scala to MongoDB command

I'm trying to update some fields in the MongoDB shell using the aggregation framework. I actually want to convert the IP addresses, currently stored as a String type in the collection, into a Long type ...
0
votes
2 answers
45 views

Subtract Seq[A] from Seq[B]

I have two classes A and B. Both of them have the same property: id and many other different properties. How can I subtract Seq[A] from Seq[B] by matching the id's?
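One common approach, sketched with hypothetical case classes A and B (the question's real fields are unknown): collect the ids of one sequence into a Set and filterNot the other.

```scala
// Sketch: drop from `bs` every element whose id also appears in `as`.
case class A(id: Int, name: String)  // hypothetical shape
case class B(id: Int, score: Double) // hypothetical shape

def subtract(bs: Seq[B], as: Seq[A]): Seq[B] = {
  val ids = as.map(_.id).toSet // a Set gives O(1) membership tests
  bs.filterNot(b => ids.contains(b.id))
}
```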
2
votes
2 answers
45 views

Scala: Combine Either per the whole List with Either per elements

I have a list of Either values, where Left represents errors: type ErrorType = List[String] type FailFast[A] = Either[ErrorType, A] import cats.syntax.either._ val l = List(1.asRight[ErrorType], 5.asRight[...
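With cats, l.sequence gives the fail-fast combination (the first Left wins). A standard-library-only sketch, using Scala 2.13's partitionMap, that instead accumulates every error:

```scala
// Sketch: combine a List of Either values into a single Either,
// accumulating all errors rather than stopping at the first Left.
type ErrorType = List[String]
type FailFast[A] = Either[ErrorType, A]

def sequenceAll[A](xs: List[FailFast[A]]): FailFast[List[A]] = {
  val (errs, oks) = xs.partitionMap(identity) // split Lefts from Rights
  if (errs.isEmpty) Right(oks) else Left(errs.flatten)
}
```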
0
votes
2 answers
52 views

How do I write a function in scala

I have the following code: val ls = List(0, -1, 2, -2) var removeNegative = List[Int]() def removeNegative(ls: List[Int]): Int = ls match { case Nil => 0 case l::for(ls <- ls){ var ...
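The excerpt mixes a match case with a for loop, which does not compile. A minimal recursive sketch of what the code appears to aim at (removing the negative numbers):

```scala
// Sketch: recursively remove negative numbers from a list.
def removeNegative(ls: List[Int]): List[Int] = ls match {
  case Nil             => Nil
  case h :: t if h < 0 => removeNegative(t)      // drop a negative head
  case h :: t          => h :: removeNegative(t) // keep a non-negative head
}
```

For example, removeNegative(List(0, -1, 2, -2)) yields List(0, 2).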
0
votes
0 answers
53 views

How to obtain class type from case class value

I have the following code snippet: case class Test(i:Int) val b = Test // b: Test.type val a = Test(1) // a: Test Is there a way to get from value a which has a Test type to Test.type?
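A singleton type like Test.type exists only at compile time; at runtime the closest you can get from a value is its Class. A small sketch:

```scala
// Sketch: what is recoverable from a case-class value at runtime.
case class Test(i: Int)

val a = Test(1)
val cls: Class[_ <: Test] = a.getClass // the runtime class, not Test.type
// The companion object itself can be referenced directly as `Test`;
// its singleton type Test.type is a compile-time notion only.
```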
3
votes
1 answer
39 views

HBase doesn't work well with spark-submit

I have an app that does some work and at the end needs to read a file from HDFS and store it into HBase. The app runs with no issue using Apache Spark with a local master, but when I run it ...
2
votes
1 answer
42 views

How to do a `getOrWaitUntilNonEmpty` as a single liner?

I have a high-level code structure that looks like this: val block: (=> Option[Seq[String]]) = ... val matches = block().get.toArray The problem is that this code may fail i.e. .get being None ...
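One way to make this a single expression is to poll the by-name block until it produces a non-empty result. A sketch; the function name and the 100 ms delay are assumptions, and a real version should add a timeout rather than loop forever:

```scala
// Sketch: re-evaluate `block` until it yields Some(non-empty Seq), then return it.
def getOrWaitUntilNonEmpty[A](block: => Option[Seq[A]], delayMs: Long = 100): Seq[A] =
  Iterator
    .continually {
      val r = block
      if (r.forall(_.isEmpty)) Thread.sleep(delayMs) // back off before retrying
      r
    }
    .collectFirst { case Some(xs) if xs.nonEmpty => xs }
    .get
```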
0
votes
4 answers
40 views

How do I append an element to a list in Scala

I have a list: val k = List(1,2,3,4,-69,78) and would like to remove all negative elements from the list. I have the following code: val k = List(1,2,3,4,-69,78) val a = List() for( k <- k){ ...
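The usual Scala answer is to filter rather than append inside a loop; a minimal sketch:

```scala
// Sketch: keep only the non-negative elements.
val k = List(1, 2, 3, 4, -69, 78)
val nonNegative = k.filter(_ >= 0)
// Note: List is immutable; `list :+ elem` returns a NEW list, so appending to
// an empty `val a = List()` inside a loop without reassigning does nothing.
```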
0
votes
1 answer
44 views

Scala Implicit syntax in polymorphic methods

I am a Scala noob reading through a parsing library, and have reached some syntax I do not understand: def parseA[_: P] = P("a") val Parsed.Success(value, successIndex) = parse("a", parseA(_)) I ...
1
vote
2 answers
24 views

scala - mock function and replace implementation in tests

I'm using scalatest and when I test my code, I want to replace the implementation of a function to return something else when running tests. In JavaScript it's really simple, and I thought I could do ...
1
vote
0 answers
28 views

How to guarantee process once only for REST API calls for all records of a SPARK dataframe

I wanted to use foreachPartition on a dataframe to send data of each row ONLY once to a REST API. val aDF= ... ///sc.parallelize(0 to 1000000,4) i.e. a dataframe ~1M rows aDF.foreachPartition(rows => ...
4
votes
0 answers
30 views

Hive partitioned table reads all the partitions despite having a Spark filter

I'm using spark with scala to read a specific Hive partition. The partition is year, month, day, a and b scala> spark.sql("select * from db.table where year=2019 and month=2 and day=28 and a='y' ...
-2
votes
0 answers
40 views

Is it a good approach to use Akka with Java as a fresher, without any background knowledge of Akka or Scala? [on hold]

I am a fresher starting with Akka using Java. Is it a good idea to start Akka with Java without knowing Scala or having any background knowledge of Akka?
0
votes
1 answer
21 views

How about Akka-grpc with flatbuffers, thoughts?

I have started working with Akka gRPC using Protocol Buffers; the samples online are very clean and concise, but with FlatBuffers being faster than Protocol Buffers and gRPC stating out-of-the-box support for ...
3
votes
1 answer
56 views

Scala how to specify a return type of tuple in a tuple

I am trying to specify a return type of tuple within a tuple: class First { def tupleReturnType(): (Any, Int) = { val tup = (1, 2, 3) // This can be variable in length val i = 4 ...
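A nested tuple type is written by nesting the parenthesized element types. A sketch based on the question's code; note that a tuple's arity is part of its type, so a "variable in length" inner tuple is not possible, and a Seq is the usual alternative:

```scala
// Sketch: returning a tuple whose first element is itself a tuple.
class First {
  def tupleReturnType(): ((Int, Int, Int), Int) = {
    val tup = (1, 2, 3) // the arity is fixed by the type
    val i = 4
    (tup, i)
  }
  // For a variable-length first component, use a Seq instead of a tuple:
  def variableLength(): (Seq[Int], Int) = (Seq(1, 2, 3), 4)
}
```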
0
votes
1 answer
27 views

Force java jar to not use classpath packages on EMR

I am trying to run a fat jar through spark-submit on EMR. I am running into a problem related to package dependencies. This project depends on google adwords library which I have included in build.sbt....
2
votes
1 answer
30 views

How to put data from various tables into separate lists via one request

For instance, I have some entity. Each entity has some attributes. The DB looks something like: an entity table (columns id, name, ...) and an entity_attribute table (columns id, ...) ...
0
votes
1 answer
27 views

java.lang.ClassNotFoundException with spark-submit [duplicate]

I installed spark 2.4.3 and sbt 1.2.8. I'm under windows 10 pro. java -version gives: java version "1.8.0_211" Java(TM) SE Runtime Environment (build 1.8.0_211-b12) Java HotSpot(TM) 64-Bit Server VM ...
0
votes
0 answers
13 views

Configure log4j.properties for Kafka appender, error when parsing property bootstrap.servers

I want to add a Kafka appender to the audit-hdfs log in a Cloudera cluster. I have successfully configured a log4j2.xml file with a Kafka appender, I need to convert this log4j2.xml into a log4j2....
1
vote
0 answers
23 views

Are 2FA TOTP scratch or recovery codes order-sensitive?

I'm working on a Google TOTP extension for Play-Silhouette, see the corresponding Play-Silhouette-Seed project here and was wondering whether the scratch or recovery codes are order-sensitive. By ...
4
votes
2 answers
57 views

Scala: no-name parameters in function with List and Option

2 different examples, the first one works: import cats.syntax.either._ val e = 10.asRight[String] def i2s(i:Int):String = i.toString e.map(i => List(i2s(i))) //using explicit parameter e....
0
votes
0 answers
25 views

Flink: The implementation of the RichSinkFunction is not serializable

I am trying to sink a stream to an S3 bucket and implement the Encoder interface. case class Info(vecId: Long, bkCode: String, state: String) class S3Encoder extends Encoder[Info] { private val ...
0
votes
1 answer
18 views

Calculating area of convex figures from given vertices in Scala

I have been given a set of vertices, which can vary from 3 to 20, and I need to implement a generic method to calculate the area of the polygon defined by those vertices. [These vertices are placed in a ...
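A standard way to do this for vertices given in order around the polygon is the shoelace formula; a sketch assuming (x, y) coordinate pairs:

```scala
// Sketch: shoelace formula for the area of a simple polygon whose
// vertices are listed in order (clockwise or counter-clockwise).
def polygonArea(vertices: Seq[(Double, Double)]): Double = {
  val crossTerms = vertices.indices.map { i =>
    val (x1, y1) = vertices(i)
    val (x2, y2) = vertices((i + 1) % vertices.length) // wrap around to the first vertex
    x1 * y2 - x2 * y1
  }
  math.abs(crossTerms.sum) / 2.0
}
```

For the unit square, for instance, this returns 1.0.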
2
votes
1 answer
34 views

Creating JSON file with for loop in scala

My requirement is to convert two strings and create a JSON file (using spray-json), then save it in a resource directory. One input string contains the ID, and the other input strings contain the score and topic ...
