Explanation of Common Special Symbols in Scala

Time: 2019-10-05

=> (anonymous function)

=> defines an anonymous function, which in Scala is also an object and can be assigned to a variable.

Scala's anonymous function definition format:

(parameter list) => {function body}

So the role of => is to create an anonymous function instance.

For example, (x: Int) => x + 1 is equivalent to the following Java method:


public int function(int x) {
 return x+1;
}

Example:


class Symbol {
 var add = (x: Int) => x + 1
}

object Test2 {
 def main(args: Array[String]): Unit = {
  val symbol = new Symbol
  println(symbol.add)    // the function value is an object
  println(symbol.add(1)) // applying it prints 2
 }
}
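Because a function literal is itself a value, it can also be passed directly to higher-order methods such as map. A minimal sketch (the object name is illustrative):

```scala
object AnonFuncDemo {
  def main(args: Array[String]): Unit = {
    val inc = (x: Int) => x + 1            // a function value of type Int => Int
    val nums = List(1, 2, 3)
    println(nums.map(inc))                 // List(2, 3, 4)
    println(nums.map((x: Int) => x * 2))   // inline literal: List(2, 4, 6)
  }
}
```

This is the idiom Spark builds on: transformations such as rdd.map and rdd.filter take exactly such function values.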

<- (collection traversal)

Loop traversal; an example follows:


val list = Array(1,2,3,4)
for (aa <- list) {
 print(aa + " ")
}

The above code is similar to the following Java code:


int[] list = {1,2,3,4};
for(int aa : list) {
 System.out.print(aa+" ");
}
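The <- generator also works with ranges, and a for comprehension can add a guard (if) to filter elements or use yield to collect results into a new collection. A small sketch:

```scala
object ForDemo {
  def main(args: Array[String]): Unit = {
    // guard: only even numbers pass through
    for (i <- 1 to 10 if i % 2 == 0) print(i + " ")   // 2 4 6 8 10
    println()
    // yield collects the results into a new collection
    val squares = for (i <- 1 to 5) yield i * i
    println(squares)   // Vector(1, 4, 9, 16, 25)
  }
}
```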

++= (string concatenation)


var s: String = "a"
s += "b"
println(s)  // ab
s ++= "c"
println(s)  // abc
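Note that on a var of type String, ++= simply re-assigns (s = s ++ "c"). For repeated appending, a mutable StringBuilder is the usual choice; it supports += for a single character and ++= for a whole string. A minimal sketch:

```scala
object AppendDemo {
  def main(args: Array[String]): Unit = {
    val sb = new StringBuilder("a")
    sb += 'b'                // += appends one character
    sb ++= "cd"              // ++= appends a whole string
    println(sb.toString())   // abcd
  }
}
```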

::: (three-colon operator) and :: (two-colon operator)

The ::: operator concatenates two Lists (similar to list1.addAll(list2) in Java, except that it returns a new list).
The :: operator prepends a single element to a List (similar to list.add(0, a) in Java, again returning a new list).

Scala operation example:


val one = List(1,2,3)
val two = List(4,5,6)
val three = one ::: two
println(three)  // List(1, 2, 3, 4, 5, 6)

val four = 7 :: three
println(four)   // List(7, 1, 2, 3, 4, 5, 6)
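Because any operator whose name ends in a colon is right-associative, :: binds to the right, so a whole list can be built by prepending elements onto Nil:

```scala
object ConsDemo {
  def main(args: Array[String]): Unit = {
    // 1 :: 2 :: 3 :: Nil parses as 1 :: (2 :: (3 :: Nil))
    val xs = 1 :: 2 :: 3 :: Nil
    println(xs)   // List(1, 2, 3)
  }
}
```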

-> constructs a tuple; ._N accesses tuple element N

1. The meaning of tuples in Scala:

  • A tuple aggregates values that may have different types
  • A tuple is written by enclosing multiple values in parentheses

2. The difference between tuples and arrays in Scala: the elements of an array must all have the same type, while the elements of a tuple may have different types.

Sample program:

val first = (1,2,3) // defines a triple

val one = 1
val two = 2
val three = one -> two // constructs the pair (1,2)

println(three)    // prints (1,2)

println(three._2) // accesses the second element of the pair: 2

_ (underscore) usage

Wildcard

_ acts as a wildcard, similar to * in Java imports:


import org.apache.spark.SparkContext._

Referring to each element in a collection

For example, filtering a list down to the elements greater than a given value:


val lst = List(1,2,3,4,5)
val lstFilter = lst.filter(_ > 3)
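Each underscore in a function literal stands for one argument, in order, so _ may appear more than once. A brief sketch:

```scala
object PlaceholderDemo {
  def main(args: Array[String]): Unit = {
    val lst = List(1, 2, 3, 4, 5)
    println(lst.map(_ * 2))      // List(2, 4, 6, 8, 10): _ is the single argument
    println(lst.reduce(_ + _))   // 15: each _ is a separate argument
  }
}
```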

Getting the element at a given position in a tuple


val ss = (1,"22","333")
println(ss._1)

Retrieving tuple members with pattern matching

val m = Map(1 -> 2, 2 -> 4)
for ((k, _) <- m) println(k) // when a component is not needed, put _ in its place; here only the key is used, so _ stands in for the value

Giving a member variable (but not a local variable) a default value


object DefaultDemo {
 var s: Int = _ // initialized to the type's default value, 0 for Int
 def main(args: Array[String]): Unit = {
  println(s) // prints 0
 }
}

: _* tells the compiler to treat a single sequence argument as a sequence of individual arguments

object VarargsDemo {
 def main(args: Array[String]): Unit = {
  val s = sum(1 to 5: _*) // pass the Range 1 to 5 as individual arguments
  println(s) // 15
 }

 def sum(args: Int*): Int = {
  var result = 0
  for (s2 <- args) {
   result += s2
  }
  result
 }
}

+=

Adding an element to a mutable array (ArrayBuffer)

import scala.collection.mutable.ArrayBuffer

val arrBuf1 = new ArrayBuffer[Int]()
arrBuf1 += 11 // add an element

println(arrBuf1) // ArrayBuffer(11)

-=

Removing the corresponding value from a mutable Map or an ArrayBuffer

import scala.collection.mutable.ArrayBuffer

val arrBuf1 = new ArrayBuffer[Int]()
arrBuf1 += 11 // add an element
arrBuf1 += 12 // add an element
arrBuf1 -= 12 // remove an element

println(arrBuf1) // ArrayBuffer(11)

var map = Map(1 -> 1, 2 -> 2, 3 -> 3)
map -= 1 // removes the entry with key 1
println(map) // Map(2 -> 2, 3 -> 3)

The above is the whole content of this article. I hope it will be helpful to your study of Scala.