# Golang SDK User Guide
This guide shows you how to report data to your project using the Golang SDK. The source code of the Golang SDK is available on GitHub.
**The latest version is**: 1.5.0.
**Update time**: 2021-11-02
# I. Integrate and Initialize SDK
# 1.1 Integrate SDK
Execute the following command to get the latest version of Golang SDK.
# acquire SDK
go get github.com/ThinkingDataAnalytics/go-sdk/thinkingdata
# update SDK
go get -u github.com/ThinkingDataAnalytics/go-sdk/thinkingdata
Module mode:
// Import thinkingdata at the beginning of the code file
import "github.com/ThinkingDataAnalytics/go-sdk/thinkingdata"
# Pull in the latest SDK module
go mod tidy
# 1.2 Initialize SDK
First, import thinkingdata at the beginning of the code file:
// import thinkingdata sdk
import "github.com/ThinkingDataAnalytics/go-sdk/thinkingdata"
To upload data with the SDK, you first create a TDAnalytics instance. Creating a TDAnalytics instance requires passing in a structure that implements the Consumer interface. Consumer is defined as follows:
// Consumer implements IO operations for data (write to disk or send to the receiver)
type Consumer interface {
	Add(d Data) error
	Flush() error
	Close() error
}
The thinkingdata
package provides three implementations of Consumer
:
**(1) LogConsumer**: writes data to local files in real time. The files are rotated by day or hour and must be uploaded with LogBus.
// Create a day-rotated LogConsumer without a single-file size limit
consumer, err := thinkingdata.NewLogConsumer("/path/to/data", thinkingdata.ROTATE_DAILY)
// Create an hour-rotated LogConsumer without a single-file size limit
consumer, err := thinkingdata.NewLogConsumer("/path/to/data", thinkingdata.ROTATE_HOURLY)
// Create a day-rotated LogConsumer with a single-file cap of 10 GB
consumer, err := thinkingdata.NewLogConsumerWithFileSize("/path/to/data", thinkingdata.ROTATE_DAILY, 10 * 1024)
// Specify a prefix for the generated files
config := thinkingdata.LogConfig{
	Directory:      "/path/to/data",
	RotateMode:     thinkingdata.ROTATE_DAILY,
	FileNamePrefix: "prefix",
}
consumer, err := thinkingdata.NewLogConsumerWithConfig(config)
The first parameter is the path of the local directory to write to. Point LogBus's monitored directory at this same path, and LogBus will monitor and upload the data.
**(2) BatchConsumer**: transmits data to the TA server in batches in real time, without requiring a transfer tool. **In the case of a prolonged network outage there is a risk of data loss.**
// Create a BatchConsumer, specifying the receiver address and APP ID
consumer, err := thinkingdata.NewBatchConsumer("SERVER_URL", "APP_ID")
// Create a BatchConsumer with compression disabled (gzip by default); suitable for intranet transmission
consumer, err := thinkingdata.NewBatchConsumerWithCompress("SERVER_URL", "APP_ID", false)
SERVER_URL is the URL for data transfer, and APP_ID is the APP ID of your project.
If you are using the cloud service, enter the following URL:
http://receiver.ta.thinkingdata.cn
If you are using a private deployment, enter the URL of your data acquisition address:
http://Data Acquisition Address
Note: before version 1.1.0, append /logagent to the URL:
http://receiver.ta.thinkingdata.cn/logagent
http://Data Acquisition Address/logagent
BatchConsumer first stores the data in a buffer; when the number of buffered records exceeds the set value (batchSize, default 20), a report is triggered. You can also specify batchSize when creating the BatchConsumer:
// Create a BatchConsumer, specifying the receiver address, APP ID, and batchSize
consumer, err := thinkingdata.NewBatchConsumerWithBatchSize("SERVER_URL", "APP_ID", 50)
**(3) DebugConsumer**: transmits data to the TA server one record at a time in real time, and returns detailed error messages when the data format is wrong. It is recommended to use DebugConsumer to validate the data format first; do not use it in a production environment.
consumer, _ := thinkingdata.NewDebugConsumer("SERVER_URL", "APP_ID")
If you do not want the data to be stored, but only want to verify the data format, you can initialize as follows:
// The last parameter defaults to true, meaning the data is stored; pass false to verify only
consumer, _ := thinkingdata.NewDebugConsumerWithWriter("SERVER_URL", "APP_ID", false)
SERVER_URL is the URL for data transfer, and APP_ID is the APP ID of your project; the same cloud-service and private-deployment URLs as for BatchConsumer apply.
# 1.3 Create an SDK Instance
Pass in the created Consumer to get the corresponding TDAnalytics instance:
ta, err := thinkingdata.New(consumer)
Then you can use TA's interfaces to report data.
# II. Report Data
After the SDK initialization is completed, you can call track to upload events. In general, you may need to upload from a dozen to hundreds of different events. If you are using the TA background for the first time, we recommend that you upload a few key events first.
If you are unsure about what kind of events you need to send, see the Quick Start Guide for more information.
# 2.1 Send Events
You can call track to upload events. It is recommended that you set the event properties and sending conditions according to the data plan you drew up earlier:
// Set event properties
properties := map[string]interface{}{
	// "#time" is a system preset property, optional. It takes a time.Time object,
	// or a TA-format time string, indicating when the event occurred.
	// If this property is not set, the current system time is used by default
	//"#time": time.Now().UTC(),
	//"#time": "2020-02-02 11:49:43.222",
	// "#ip" is a system preset property, optional. If the user's IP address is
	// available on the server, fill it in; thinkingdata automatically parses
	// the user's province and city from the IP address
	"#ip": "123.123.123.123",
	// User-defined property, string type
	"prop_string": "abcdefg",
	// User-defined property, numeric type
	"prop_num": 56.56,
	// User-defined property, bool type
	"prop_bool": true,
	// User-defined property, time.Time type
	"prop_date": time.Now(),
}
account_id := "user_account_id"     // account ID
distinct_id := "ABCDEF123456AGDCDD" // distinct ID (guest ID)
// Report an event named TEST_EVENT. account_id and distinct_id cannot both be empty
ta.Track(account_id, distinct_id, "TEST_EVENT", properties)
**Note:** To ensure that the guest ID and account ID can be bound correctly, if your game uses both a guest ID and an account ID, we strongly recommend uploading both IDs at the same time; otherwise accounts may fail to match, resulting in users being counted twice. For the specific ID binding rules, please refer to the chapter User Identification Rules.
- The event name is of string type. It can only start with a letter and can contain digits, letters, and underscores "_". It is up to 50 characters long and is not case sensitive.
- The event properties are a map, where each element represents one property.
- The key of an event property is the property name, of string type. It can only start with a letter and can contain digits, letters, and underscores "_". It is up to 50 characters long and is not case sensitive.
- The value of an event property is the property's value, supporting string, numeric, bool, time.Time, and array types.
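The naming rules above can be checked before reporting. A minimal hypothetical validator (not part of the SDK) might look like:

```go
package main

import (
	"fmt"
	"regexp"
)

// namePattern encodes the rules above: starts with a letter, then
// letters, digits, or underscores, at most 50 characters in total.
var namePattern = regexp.MustCompile(`^[a-zA-Z][a-zA-Z0-9_]{0,49}$`)

// isValidName reports whether s is a legal event or property name.
func isValidName(s string) bool {
	return namePattern.MatchString(s)
}

func main() {
	fmt.Println(isValidName("TEST_EVENT"))  // true
	fmt.Println(isValidName("1st_event"))   // false: starts with a digit
	fmt.Println(isValidName("prop_string")) // true
}
```

Note that system preset properties such as #time and #ip begin with "#" and are exempt from this rule.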
# 2.2 Set Public Event Properties
For properties that need to appear in all events, you can call SetSuperProperties to set public event properties. We recommend that you set the public event properties before sending events.
// Set public event properties
ta.SetSuperProperties(map[string]interface{}{
	"SUPER_TIME":   time.Now(),
	"SUPER_BOOL":   true,
	"SUPER_STRING": "hello",
	"SUPER_NUM":    15.6,
})
- The public event properties are a map, where each element represents one property.
- The key of a public event property is the property name, of string type. It can only start with a letter and can contain digits, letters, and underscores "_". It is up to 50 characters long and is not case sensitive.
- The value of a public event property is the property's value, supporting string, numeric, bool, time.Time, and array types.
Setting public properties is equivalent to setting those properties on every event. If a property in an event has the same name as a public property, the event property overrides the public property for that record; if no property with the same name exists, the public property is added. You can get the current public event properties through this interface:
currentSuperProperties := ta.GetSuperProperties()
You can call ClearSuperProperties to clear all public event properties that have been set:
ta.ClearSuperProperties()
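The override rule can be sketched as a plain map merge. This self-contained snippet (illustrative, not SDK code) shows how a per-event property with the same name wins over a public property:

```go
package main

import "fmt"

// mergeProperties illustrates the rule above: start from the public
// (super) properties, then let the event's own properties override
// any keys with the same name.
func mergeProperties(super, event map[string]interface{}) map[string]interface{} {
	merged := make(map[string]interface{}, len(super)+len(event))
	for k, v := range super {
		merged[k] = v
	}
	for k, v := range event {
		merged[k] = v // same-name event property overrides the public one
	}
	return merged
}

func main() {
	super := map[string]interface{}{"channel": "appstore", "SUPER_NUM": 15.6}
	event := map[string]interface{}{"channel": "web", "prop_num": 56.56}
	merged := mergeProperties(super, event)
	fmt.Println(merged["channel"])   // web: the event property wins
	fmt.Println(merged["SUPER_NUM"]) // 15.6: inherited from the public properties
}
```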
# III. User Attributes
The TA platform currently supports the following user-property interfaces: UserSet, UserSetOnce, UserAdd, UserDelete, UserUnset, and UserAppend.
# 3.1 UserSet
For general user properties, you can call UserSet to set them. Properties uploaded through this interface overwrite the user's existing properties of the same name; if a property did not exist before, it is created:
ta.UserSet(account_id, distinct_id, map[string]interface{}{
	"USER_STRING": "some message",
	"USER_DATE":   time.Now(),
})
// Upload user properties again, and the user's "USER_STRING" is overwritten with "another message"
ta.UserSet(account_id, distinct_id, map[string]interface{}{
	"USER_STRING": "another message",
})
- The user properties set by UserSet are a map, where each element represents one property.
- The key is the property name, of string type. It can only start with a letter and can contain digits, letters, and underscores "_". It is up to 50 characters long and is not case sensitive.
- Property values support five types: string, numeric, bool, time.Time, and array types.
# 3.2 UserSetOnce
If you want an uploaded user property to be set only once, you can call UserSetOnce. If the property already has a value, the upload is ignored:
ta.UserSetOnce(account_id, distinct_id, map[string]interface{}{
	"USER_STRING": "some message",
	"USER_DATE":   time.Now(),
})
// Upload user properties again with UserSetOnce, and the user's "USER_STRING" is still "some message"
ta.UserSetOnce(account_id, distinct_id, map[string]interface{}{
	"USER_STRING": "another message",
})
// Upload user properties with UserSet, and the user's "USER_STRING" is overwritten with "other message"
ta.UserSet(account_id, distinct_id, map[string]interface{}{
	"USER_STRING": "other message",
})
UserSetOnce has the same property types and restrictions as UserSet.
# 3.3 UserAdd
When you want to upload a numeric property, you can call UserAdd to accumulate it. If the property has not been set yet, a value of 0 is assigned before the calculation. Negative values can be passed in, which is equivalent to subtraction.
ta.UserAdd(account_id, distinct_id, map[string]interface{}{
	"Amount": 50,
})
// Upload user properties again, and the user's "Amount" accumulates to 80
ta.UserAdd(account_id, distinct_id, map[string]interface{}{
	"Amount": 30,
})
UserAdd has the same property restrictions as UserSet, but only supports numeric user properties.
# 3.4 UserDelete
If you want to delete a user, you can call UserDelete. After deletion you can no longer query the user's properties, but the events the user generated can still be queried.
ta.UserDelete(account_id, distinct_id)
# 3.5 UserUnset
When you need to clear the value of one of a user's properties, you can call UserUnset:
// Clear a single user property, passing the property name as the parameter
ta.UserUnset(account_id, distinct_id, property_name)
# 3.6 UserAppend
When you want to append values to an array-type user property, you can call UserAppend. If the property has not been created yet, UserAppend creates it:
// Append to two array-type user properties; UserAppend only supports []string values
err = ta.UserAppend(account_id, distinct_id, map[string]interface{}{
	"array":   []string{"str1", "str2"},
	"arrkey1": []string{"str3", "str4"},
})
if err != nil {
	fmt.Println("user append failed", err)
}
# IV. Other Operations
# 4.1 Submit Data Immediately
This operation depends on the specific Consumer implementation. When receiving data, the Consumer may first store it in a buffer and trigger the actual IO under certain conditions to improve overall performance. If you need to submit data immediately, call the Flush interface.
// Submit data immediately to the appropriate receiver
ta.Flush()
# 4.2 Close the SDK
When using BatchConsumer, you must call Close before the server shuts down or the SDK exits; otherwise some buffered data may be lost.
// Close and exit the SDK
ta.Close()
Please call this interface before shutting down the server to avoid losing data in the cache.
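A common pattern is to defer Close right after creating the instance, so buffered data is flushed on every exit path. The sketch below uses a stub type in place of the SDK instance so it runs on its own; only the close-on-shutdown behavior is modeled:

```go
package main

import "fmt"

// analytics stands in for a thinkingdata TDAnalytics instance; only
// the shutdown behavior relevant here is modeled.
type analytics struct {
	buffered int // records still waiting in the cache
}

// Track buffers one record.
func (a *analytics) Track() { a.buffered++ }

// Close flushes whatever is still buffered before the process exits.
func (a *analytics) Close() {
	fmt.Printf("flushing %d buffered records\n", a.buffered)
	a.buffered = 0
}

func run() *analytics {
	ta := &analytics{}
	// Defer Close immediately after creation, so it also runs if a
	// later step returns early or panics.
	defer ta.Close()
	ta.Track()
	ta.Track()
	return ta
}

func main() {
	ta := run()
	fmt.Println(ta.buffered) // 0: Close drained the buffer on the way out
}
```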
# V. Relevant Preset Attributes
# 5.1 Preset Properties for All Events
The following preset properties are carried by all events reported by the Go SDK (including auto-collected events).
Attribute name | Display name | Description |
---|---|---|
#ip | IP address | The user's IP address, which must be set manually; TA uses it to infer the user's geographic location |
#country | Country | The user's country, derived from the IP address |
#country_code | Country code | The code of the user's country (ISO 3166-1 alpha-2, i.e. two uppercase letters), derived from the IP address |
#province | Province | The user's province, derived from the IP address |
#city | City | The user's city, derived from the IP address |
#lib | SDK type | The type of SDK you integrated, such as tga_go_sdk |
#lib_version | SDK version | The version of the Go SDK you integrated |
# VI. Advanced Functions
Starting with v1.2.0, the SDK supports reporting two special types of events: updatable events and rewritable events. They must be used with TA system version 2.8 or later. Since special events are only applicable in certain specific scenarios, please consult ThinkingData's customer success team and analysts before using them to report data.
# 6.1 Updatable Events
Updatable events let you modify event data after it has been reported. An updatable event needs an event ID that identifies it, passed in when the event is reported. The TA background determines which data to update based on the event name and event ID.
// Example: report an updatable event, assuming the event name is UPDATABLE_EVENT
properties := make(map[string]interface{})
properties["status"] = 3
properties["price"] = 100
consumer, _ := thinkingdata.NewBatchConsumer("url", "appid")
ta, _ := thinkingdata.New(consumer)
// After reporting, the event properties are status 3 and price 100
ta.TrackUpdate("account_id", "distinct_id", "UPDATABLE_EVENT", "test_event_id", properties)
properties_new := make(map[string]interface{})
properties_new["status"] = 5
// After reporting, status is updated to 5 and price remains unchanged
ta.TrackUpdate("account_id", "distinct_id", "UPDATABLE_EVENT", "test_event_id", properties_new)
# 6.2 Rewritable Events
Rewritable events are similar to updatable events, except that a rewritable event completely replaces its historical data with the latest data, which is equivalent to deleting the previous data and storing the latest. The TA background determines which data to update based on the event name and event ID.
// Example: report a rewritable event, assuming the event name is OVERWRITE_EVENT
properties := make(map[string]interface{})
properties["status"] = 3
properties["price"] = 100
consumer, _ := thinkingdata.NewBatchConsumer("url", "appid")
ta, _ := thinkingdata.New(consumer)
// After reporting, the event properties are status 3 and price 100
ta.TrackOverwrite("account_id", "distinct_id", "OVERWRITE_EVENT", "test_event_id", properties)
properties_new := make(map[string]interface{})
properties_new["status"] = 5
// After reporting, status is updated to 5 and the price property is deleted
ta.TrackOverwrite("account_id", "distinct_id", "OVERWRITE_EVENT", "test_event_id", properties_new)
# ChangeLog
# v1.5.0 2021/11/02
- Added support for complex structure types
# v1.4.0 2021/05/10
- BatchConsumer optimization: added a cache to retain data when the network connection is interrupted
# v1.3.0 2020/11/25
- LogConsumer optimization: support for automatic directory creation
- Optimization: add automatic upload function
# v1.2.0 2020/08/24
- Supports updatable and rewritable events
# v1.1.1 2020/07/08
- The field #time supports uploading strings that conform to TA format
- Removed the 2 KB limit on field size
# v1.1.0 2020/02/12
- Support reporting array types
- Support UserAppend interface
- DebugConsumer Optimization: More complete and accurate validation of data at the server level
- BatchConsumer performance optimization: support for configuring compression mode; remove Base64 encoding
# v1.0.2 2019/12/25
- Support UserUnset interface
# v1.0.1 2019/12/12
- Fixed empty property values not being written to the log
# v1.0.0 2019/09/25
- Implemented the core functions of data reporting
- Track: Track user behavior events
- Public event property settings
- User feature settings: UserSet, UserSetOnce, UserAdd, UserDelete
- Support: LogConsumer, DebugConsumer, BatchConsumer