• ScalikeJDBC connecting to MySQL


    ScalikeJDBC is a JDBC framework for Scala; its official website describes it as easy to use and very flexible. 1. Add dependencies <dependency> <groupId>org.scalikejdbc</groupId> <artifactId>scalikejdbc_2.11</artifactId> <version>3.3.2</version> </dependency> <dependency> <groupId>com.h2database</groupId> <artifactId>h2</artifactId> <version>1.4.197</version> </dependency> <dependency> <groupId>ch.qos.logback</groupId> <artifactId>logback-classic</artifactId> <version>1.2.3</version> </dependency> <dependency> <groupId>mysql</groupId> <artifactId>mysql-connector-java</artifactId> <version>8.0.15</version> </dependency> 2. Sample code import scalikejdbc._ object MySQLDAO { def main(args: Array[String]): Unit = […]
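The excerpt's `MySQLDAO` is cut off, so here is a minimal runnable sketch of the ScalikeJDBC API. It uses the H2 in-memory database from the dependency list above so it runs without a MySQL server; the `users` table and the credentials are invented for illustration, and for MySQL you would swap in a `jdbc:mysql://…` URL.

```scala
import scalikejdbc._

object ScalikeJdbcDemo {
  // H2 in-memory database so the sketch needs no running MySQL server.
  // For MySQL, use e.g. "jdbc:mysql://localhost:3306/test" with real credentials.
  def init(): Unit = {
    Class.forName("org.h2.Driver")
    ConnectionPool.singleton("jdbc:h2:mem:demo;DB_CLOSE_DELAY=-1", "sa", "")
    DB.autoCommit { implicit session =>
      // A made-up table, purely for illustration.
      sql"create table users (id int primary key, name varchar(64))".execute.apply()
      sql"insert into users values (1, 'alice'), (2, 'bob')".update.apply()
    }
  }

  // Typical read path: map each row to a column value.
  def fetchNames(): List[String] = DB.readOnly { implicit session =>
    sql"select name from users order by id".map(_.string("name")).list.apply()
  }

  def main(args: Array[String]): Unit = {
    init()
    println(fetchNames()) // List(alice, bob)
  }
}
```

The `sql"…"` interpolator plus `.map(…).list.apply()` is the framework's standard query pattern; `DB.readOnly` and `DB.autoCommit` borrow a session from the singleton connection pool.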

  • Learn more about groovy and scala classes in Java


    Preface: Java's legacy is the platform, not the language. More than 200 languages can run on the JVM, and it is inevitable that one of them will eventually replace Java as the best way to write JVM programs. This series explores three next-generation JVM languages, groovy, Scala and clojure, and compares […]

  • Talk about comma expressions in JavaScript and Scala


    Let's take a look at the following simple JavaScript code. Function f is called on line 10, and the second and third arguments passed in are comma expressions. The implementation of f checks the type of these two parameters: if one is a function, it executes the function call and then prints its return […]
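The Scala side of that comparison can be sketched as follows: a Scala block is itself an expression whose value is its last line, which plays a role similar to a JavaScript comma expression evaluating to its last operand. (The object and method names here are made up for the sketch.)

```scala
object BlockExpr {
  // A Scala block is an expression; its value is the block's last expression,
  // much as a JavaScript comma expression evaluates to its last operand.
  def lastOfBlock(): Int = {
    println("side effect runs first")
    40 + 2 // the block's value
  }

  def main(args: Array[String]): Unit =
    println(lastOfBlock()) // 42
}
```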

  • RDD partition algorithm in Spark


    The partition algorithm of RDD in Spark: def positions(length: Long, numSlices: Int): Iterator[(Int, Int)] = { (0 until numSlices).iterator.map { i => val start = ((i * length) / numSlices).toInt val end = (((i + 1) * length) / numSlices).toInt (start, end) } } /** numSlices is the number of partitions; (0 until numSlices).iterator turns […]
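The snippet above can be packaged as a self-contained sketch to see the partition boundaries it produces: each slice i covers the half-open index range [i*length/numSlices, (i+1)*length/numSlices), so slice sizes differ by at most one.

```scala
object Positions {
  // Spark's slicing rule: partition i covers indices
  // [i * length / numSlices, (i + 1) * length / numSlices).
  def positions(length: Long, numSlices: Int): Iterator[(Int, Int)] =
    (0 until numSlices).iterator.map { i =>
      val start = ((i * length) / numSlices).toInt
      val end   = (((i + 1) * length) / numSlices).toInt
      (start, end)
    }

  def main(args: Array[String]): Unit = {
    // 10 elements over 3 slices: sizes 3, 3 and 4.
    println(positions(10, 3).toList) // List((0,3), (3,6), (6,10))
  }
}
```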

  • Spark framework: build Scala development environment under win10 system


    1. Scala environment basics Scala wraps the relevant Java classes and interfaces, so it depends on the JVM environment. JDK 1.8 as the Java dependency; Scala 2.11 as the installed version; IDEA 2017.3 as the development tool. 2. Configure the Scala zip distribution 1) make sure the path contains no spaces or Chinese characters 2) configure the environment variables and add the bin directory to PATH […]

  • Scala common operations


    Scala common operations. Version information: Python 3.7, PySpark 2.4.0. from pyspark import SQLContext,SparkContext,SparkConf conf = SparkConf() sc = SparkContext(conf=conf) sqlContext = SQLContext(sc) # Load a CSV file data = sqlContext.read.format("csv").option("header","true").load("union_order_user") # Sort by a field in descending order sorted = data.sort("created_at", ascending=False) # Show the first 100 records (show() prints 20 by default) sorted.show(100)

  • 1. Scala language overview


    Chapter 1: an overview of the Scala language == Knowledge structure == The Scala material is divided into several stages: 1. Scala language overview 2. Basic knowledge of Scala, Scala data structures, object-oriented Scala, functional programming in Scala. 1. Scala language overview 1.1 Learning tasks 1. Understand the features of the Scala language; learn to configure the Scala environment; configure […]

  • Example code of bubble sort, merge sort and quicksort in Scala


    1. Bubble sorting def sort(list: List[Int]): List[Int] = list match { case List() => List() case head :: tail => compute(head, sort(tail)) } def compute(data: Int, dataSet: List[Int]): List[Int] = dataSet match { case List() => List(data) case head :: tail => if (data <= head) data :: dataSet else head :: compute(data, tail) } […]
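The sorting excerpt above can be made self-contained like this; the logic is exactly the excerpt's `sort`/`compute` pair (sort the tail recursively, then insert the head into the sorted result), with only a small driver added.

```scala
object FunctionalSort {
  // Sort an immutable list: recursively sort the tail,
  // then insert the head at its correct position.
  def sort(list: List[Int]): List[Int] = list match {
    case List()       => List()
    case head :: tail => compute(head, sort(tail))
  }

  // Insert `data` into the already-sorted list `dataSet`.
  def compute(data: Int, dataSet: List[Int]): List[Int] = dataSet match {
    case List() => List(data)
    case head :: tail =>
      if (data <= head) data :: dataSet
      else head :: compute(data, tail)
  }

  def main(args: Array[String]): Unit =
    println(sort(List(3, 1, 4, 1, 5, 9, 2, 6))) // List(1, 1, 2, 3, 4, 5, 6, 9)
}
```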

  • An example of how to generate random numbers using Scala


    Using Scala to generate random numbers 1. Simple version: /* 1. you can use scala.util.Random.nextInt(10) to produce a number between 0 and 9 2. likewise, nextInt(100) produces a number between 0 and 99 */ object Test { def main(args: Array[String]): Unit = { var i = 0 while (i < 10) { val str = scala.util.Random.nextInt(100).toString println(str) […]
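A corrected, runnable version of the simple sketch. Note that `nextInt(n)` returns a value in the half-open range [0, n), so add 1 if a 1-to-n range is wanted, and the loop variable must be incremented or the loop never terminates.

```scala
object RandomDemo {
  // Ten values drawn from [0, 100), i.e. 0 to 99 inclusive.
  def tenRandoms(): List[Int] =
    List.fill(10)(scala.util.Random.nextInt(100))

  def main(args: Array[String]): Unit = {
    var i = 0
    while (i < 10) {                          // the excerpt's loop, braces restored
      println(scala.util.Random.nextInt(100)) // 0 to 99 inclusive
      i += 1                                  // the missing increment
    }
    println(scala.util.Random.nextInt(100) + 1) // shift the range: 1 to 100 inclusive
  }
}
```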

  • The implementation of implicit type conversion in Scala


    Implicit conversion in Scala is a very powerful language feature that can play two roles: 1. automatic implicit conversion of some data types. A String cannot be automatically converted to Int, so when a variable or constant of type Int is assigned a value of type String, the compiler reports an error. So, […]
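A minimal sketch of the first role described above, using a Scala 2 implicit `def` (the conversion name `str2int` is arbitrary): with the conversion in scope, the compiler inserts it wherever an Int is expected but a String is supplied.

```scala
import scala.language.implicitConversions

object ImplicitDemo {
  // Without this conversion, `val n: Int = "42"` does not compile.
  // With it in scope, the compiler rewrites that line to str2int("42").
  implicit def str2int(s: String): Int = s.toInt

  def demo(): Int = {
    val n: Int = "42" // implicitly converted
    n + 8
  }

  def main(args: Array[String]): Unit =
    println(demo()) // 50
}
```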

  • Spark series — learn Scala concurrent programming from zero


    (The article is reproduced, with authorization, from the official forum of the Heima ("black horse") programmer Guangzhou center.)

  • The difference between map and flatMap in Scala


    In functional languages, functions are first-class citizens: they can be defined anywhere, inside or outside other functions, and they can be passed as parameters and returned as results of other functions. Since imperative languages can also achieve higher-order functions via function pointers, the most important advantage of functional programming is actually immutability. […]
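The title's map/flatMap difference can be shown in a few lines on an immutable list: `map` keeps one output element per input element, so a splitting function yields a nested structure, while `flatMap` maps and then flattens one level.

```scala
object MapVsFlatMap {
  val lines = List("a b", "c d")

  // map: one output element per input element -> nested List
  def mapped: List[List[String]] = lines.map(_.split(" ").toList)

  // flatMap: map, then flatten one level -> flat List
  def flatMapped: List[String] = lines.flatMap(_.split(" ").toList)

  def main(args: Array[String]): Unit = {
    println(mapped)     // List(List(a, b), List(c, d))
    println(flatMapped) // List(a, b, c, d)
  }
}
```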