Using AWS SQS as a buffer queue (Go)

Time: 2020-01-13

Background

As a newcomer to the project, my first task was to put a message queue in front of the storage of operation records. Why? In the existing system we write users' operation records directly to MongoDB, but at traffic peaks MongoDB cannot keep up. We therefore need to smooth out those peaks, and the conventional way to do that is a message queue. Since the project already runs mostly on AWS cloud services, we chose the AWS message queue, SQS.

Points to note

  1. AWS SQS charges by the number of requests, so use batch operations wherever possible (a batch-send sketch follows this list)
  2. AWS SQS limits in-flight messages: at most 12,000 messages that have been received by consumers but not yet deleted can be in transit at a time
  3. AWS SQS queue capacity is effectively unlimited
  4. Batch operations are limited to 10 messages per request (it is billed per request, after all)
  5. When consuming from AWS SQS in parallel, duplicate messages may appear. We deduplicate using the database ID: the ID is generated at produce time with MongoDB's own library, because MongoDB-generated IDs are fairly uniform, which keeps the index tree in the database balanced and efficient
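
Since billing is per request and each batch holds at most 10 entries, it usually pays to group records before sending them. Below is a minimal sketch of such a batch send; it assumes the AwsSQS wrapper and error helpers introduced in the steps further down, plus strconv for the per-batch entry IDs, and the chunking logic is illustrative rather than taken from the original code.

// SendMessageBatch groups records into chunks of at most 10 (the SQS batch limit)
// and sends each chunk with a single SendMessageBatch request.
func (awsSqs *AwsSQS) SendMessageBatch(records []string, qURL string) *Error {
    for start := 0; start < len(records); start += 10 {
        end := start + 10
        if end > len(records) {
            end = len(records)
        }
        entries := make([]*sqs.SendMessageBatchRequestEntry, 0, end-start)
        for i, record := range records[start:end] {
            entries = append(entries, &sqs.SendMessageBatchRequestEntry{
                Id:          aws.String(strconv.Itoa(i)), // only needs to be unique within this request
                MessageBody: aws.String(record),
            })
        }
        _, err := awsSqs.svc.SendMessageBatch(&sqs.SendMessageBatchInput{
            QueueUrl: &qURL,
            Entries:  entries,
        })
        if err != nil {
            Errorf("Error SendMessageBatch to sqs: err = %v", err)
            return NewError(ErrorCodeInnerError, err.Error())
        }
    }
    return nil
}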

Operation steps

The basic steps for using AWS SQS are the same as for any other message queue, and the official AWS SQS documentation describes them in detail, so refer to it whenever possible. Below are the simplified steps with example code. The code is written in Go; for other languages, see the corresponding official documentation.

  1. Configure connection information for AWS SQS
awsSqs := AwsSQS{}

// Static credentials; replace "key", "secret" and "region" with real values.
creds := credentials.NewStaticCredentials("key", "secret", "")
sess := session.Must(session.NewSession(&aws.Config{
    Region:      aws.String("region"),
    Credentials: creds,
}))

// Create the SQS client from the session.
awsSqs.svc = sqs.New(sess)
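
The snippet above assumes an AwsSQS wrapper type and the AWS SDK for Go (v1) packages. A minimal sketch of those pieces (the svc field matches the methods below; the rest is an assumption about the surrounding code):

import (
    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/credentials"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/sqs"
)

// AwsSQS wraps the SQS client used by the helper methods in this post.
type AwsSQS struct {
    svc *sqs.SQS
}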
  2. Send data to AWS SQS
// Send a message to the queue
func (awsSqs *AwsSQS) SendMessage(record string, qURL string) *Error {
    _, err := awsSqs.svc.SendMessage(&sqs.SendMessageInput{
        MessageBody: aws.String(record),
        QueueUrl:    &qURL,
    })
    if err != nil {
        Errorf("Error Send Message to sqs: err = %v", err)
        return NewError(ErrorCodeInnerError, err.Error())
    }
    return nil
}
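
A typical producer call serializes the operation record (for example as JSON) and hands it to SendMessage. A minimal usage sketch; the queue URL and record fields are placeholders, and encoding/json is just one option:

// Serialize an operation record and push it to the queue.
qURL := "https://sqs.us-east-1.amazonaws.com/123456789012/operation-records" // placeholder URL
body, _ := json.Marshal(map[string]string{"userId": "u-123", "action": "login"})
if e := awsSqs.SendMessage(string(body), qURL); e != nil {
    // the record was not enqueued; log or retry as appropriate
}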
  3. Get data from AWS SQS
// Get messages from the queue
func (awsSqs *AwsSQS) ReserveMessage(qURL string) (*sqs.ReceiveMessageOutput, *Error) {
    result, err := awsSqs.svc.ReceiveMessage(&sqs.ReceiveMessageInput{
        QueueUrl:            &qURL,
        MaxNumberOfMessages: aws.Int64(10), // up to the batch limit of 10 messages
        WaitTimeSeconds:     aws.Int64(10), // long polling to reduce empty receives
    })

    if err != nil {
        Errorf("Error aws sqs ReceiveMessage : err=%v ", err)
        return nil, NewError(ErrorCodeInnerError, err.Error())
    }

    return result, nil
}
  4. Delete data from AWS SQS
// For each received message, build a delete entry from its MessageId and ReceiptHandle.
deleteMessageList := make([]*sqs.DeleteMessageBatchRequestEntry, 0)
deleteMessage := sqs.DeleteMessageBatchRequestEntry{Id: message.MessageId, ReceiptHandle: message.ReceiptHandle}
deleteMessageList = append(deleteMessageList, &deleteMessage)

// Delete messages from the queue (batch delete)
func (awsSqs *AwsSQS) DeleteMessage(list []*sqs.DeleteMessageBatchRequestEntry, qURL string) *Error {
    // delete message
    _, err := awsSqs.svc.DeleteMessageBatch(&sqs.DeleteMessageBatchInput{
        QueueUrl: &qURL,
        Entries:  list,
    })

    if err != nil {
        Errorf("Delete Message error:error =%v", err)
        return NewError(ErrorCodeInnerError, err.Error())
    }
    return nil
}
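
Putting the consumer side together: receive a batch, store it, and delete only what was processed successfully, so unprocessed messages are redelivered. A rough sketch of such a loop; the saveToMongo helper is hypothetical and stands in for the deduplicated MongoDB write shown in the next step:

func (awsSqs *AwsSQS) consumeLoop(qURL string) {
    for {
        result, e := awsSqs.ReserveMessage(qURL)
        if e != nil {
            continue // transient error: try again on the next iteration
        }
        entries := make([]*sqs.DeleteMessageBatchRequestEntry, 0, len(result.Messages))
        for _, message := range result.Messages {
            if err := saveToMongo(*message.Body); err != nil {
                continue // not deleted, so SQS will redeliver the message later
            }
            entries = append(entries, &sqs.DeleteMessageBatchRequestEntry{
                Id:            message.MessageId,
                ReceiptHandle: message.ReceiptHandle,
            })
        }
        if len(entries) > 0 {
            awsSqs.DeleteMessage(entries, qURL)
        }
    }
}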
  5. Prevent duplication when storing to MongoDB
// Customize MongoDB's _id and generate it with MongoDB's own bson library,
// so duplicate deliveries of the same record map to the same document.
type entity struct {
    Id string `bson:"_id,omitempty"`
}

id := bson.NewObjectId().Hex()
record := entity{Id: id}
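
Because the _id is fixed when the record is produced, a message delivered twice maps to the same document, and the second insert fails with a duplicate-key error that can simply be ignored. A sketch of that idea with the mgo driver (the collection handle is assumed to exist; the driver matches the bson package used above):

// Insert the record; a duplicate delivery causes a duplicate-key error on _id.
err := collection.Insert(record)
if err != nil && !mgo.IsDup(err) {
    Errorf("Error inserting record into mongodb: err = %v", err)
}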

Reference material

Official code example
AWS restriction
