# prefect-community
r
Hello, I have a Fargate EKS question. I was on the verge of deploying my Prefect file into Fargate. I'm using Redis as a temporary store. I created a Redis ElastiCache cluster on AWS and I'm having trouble connecting to it via Kubernetes. Mainly getting a
Copy code
temporary failure in name resolution
I'm not even sure whom to ask about this, but since I was using Prefect I figured someone here knows about it. I raised it here: https://stackoverflow.com/questions/71864208/unable-to-connect-to-redis-elasticcache-from-fargate. Curious if someone has any suggestions? My Fargate profile has the same 4 subnets as my ElastiCache cluster, and they also share the same security group.
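Not a fix for the VPC setup itself, but a minimal connectivity probe can at least confirm whether the failure really is name resolution or the security group. This is just a sketch, assuming redis-py and a placeholder ElastiCache endpoint, run from a pod inside the same EKS cluster:
```python
import socket

import redis

ENDPOINT = "my-redis.xxxxxx.0001.use1.cache.amazonaws.com"  # placeholder, use your primary endpoint
PORT = 6379

# Step 1: does DNS resolve at all? "Temporary failure in name resolution"
# means this call fails before any Redis traffic is attempted.
try:
    print("resolved to:", socket.gethostbyname(ENDPOINT))
except socket.gaierror as exc:
    print("DNS resolution failed:", exc)
    raise SystemExit(1)

# Step 2: can we actually reach Redis (subnets / security group rules)?
client = redis.Redis(host=ENDPOINT, port=PORT, socket_connect_timeout=5)
print("PING ->", client.ping())
```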
a
Why Redis? 😄 We discussed this together, and Timestream is so much better suited to your use case. You're dealing with time-series data. Redis has a plugin for time series, but it's not available in ElastiCache, because ElastiCache is an AWS "fork" of Redis (or a plain version with no custom plugins), and afaik RedisLabs only supports the time-series plugin in their own product. I honestly think you would be doing yourself a huge favor by switching to Timestream. ElastiCache is a pain to manage. I can't help with the connection issues because it was set up for me by DevOps in the past, but nobody was happy with it 😂 it seems you are having the same experience
👍 1
r
ugh yea you are right. It's literally a pain to integrate Redis ElastiCache with Fargate. I just switched to Timestream. Question: any idea how to write data into Timestream with multiple fields? Example:
timestamp | symbol | bids | asks
122455 | BTC | [22.33, 44.55, ...] | [55.66, 77.28]
My bids and asks fields are lists of prices at that timestamp; I converted them to strings. Is this possible in Timestream? At the moment I have it as:
record = {
    "Time": str(int(round(time.time() * 1000))),
    "TimeUnit": "MILLISECONDS",
    "Dimensions": [{"Name": "symbol", "Value": "BTC"}],
    "MeasureName": "asks",
    "MeasureValue": str(records[0]['asks']),
    "MeasureValueType": "VARCHAR",
}
a
It's a matter of how you format the records. Here is how you can do that
Copy code
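# Note: "now" here is the current epoch time in milliseconds as a string
# (e.g. str(int(round(time.time() * 1000))) as in the snippet above), and
# btc / eth / dash / rep hold the latest price for each symbol.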
records = [
    {
        "Time": now,
        "TimeUnit": "MILLISECONDS",
        "Dimensions": [{"Name": "crypto", "Value": "BTC"}],
        "MeasureName": "Price",
        "MeasureValue": str(btc),
        "MeasureValueType": "DOUBLE",
    },
    {
        "Time": now,
        "TimeUnit": "MILLISECONDS",
        "Dimensions": [{"Name": "crypto", "Value": "ETH"}],
        "MeasureName": "Price",
        "MeasureValue": str(eth),
        "MeasureValueType": "DOUBLE",
    },
    {
        "Time": now,
        "TimeUnit": "MILLISECONDS",
        "Dimensions": [{"Name": "crypto", "Value": "DASH"}],
        "MeasureName": "Price",
        "MeasureValue": str(dash),
        "MeasureValueType": "DOUBLE",
    },
    {
        "Time": now,
        "TimeUnit": "MILLISECONDS",
        "Dimensions": [{"Name": "crypto", "Value": "REP"}],
        "MeasureName": "Price",
        "MeasureValue": str(rep),
        "MeasureValueType": "DOUBLE",
    },
]
and then to write the records:
Copy code
import boto3

write_client = boto3.client("timestream-write")
rejected_records = write_client.write_records(
    DatabaseName="demo", TableName="data", Records=records, CommonAttributes={}
)
print(rejected_records)
this blog post shows more details, and this one has a full example incl. a guide on how to build a Grafana dashboard for that 🙂
❤️ 1
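Side note on error handling: if Timestream rejects some of the records (e.g. because of the length quotas that come up below), boto3 raises a RejectedRecordsException rather than returning them in the normal response, so wrapping the call makes the per-record reasons visible. A minimal sketch, assuming the same demo database/table and the records list built above:
```python
import boto3

write_client = boto3.client("timestream-write")

try:
    response = write_client.write_records(
        DatabaseName="demo", TableName="data", Records=records, CommonAttributes={}
    )
    print("ingested:", response["RecordsIngested"]["Total"])
except write_client.exceptions.RejectedRecordsException as exc:
    # Each entry carries the index of the offending record and a reason string.
    for rejected in exc.response.get("RejectedRecords", []):
        print(f"record {rejected['RecordIndex']} rejected: {rejected['Reason']}")
```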
r
hi anna, thanks. So I managed to push everything to Timestream, but after 3 minutes I'm getting this error: 'The dimension name and value exceeds the maximum supported length for dimension names and values. See quotas in the Timestream developer guide for additional information.'
a
I think you know the answer, right? 😄 use shorter names for the dimensions. AWS docs for Timestream are quite good, you can look up all the details there
r
oh I think it's my value. I'm storing a list of lists as a string in there. Is that not allowed?
a
I don't know, I'm afk, can you check the Timestream docs? :)
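Since the error is about exceeding a maximum supported length, a quick pre-flight check on the serialized values can pinpoint which field blows the limit before the write is attempted. A minimal sketch over the records list from above; the limit constants are placeholders to be filled in from the Quotas page of the Timestream developer guide:
```python
# Placeholder limits: look up the real numbers on the Quotas page of the
# Timestream developer guide before relying on this check.
MAX_DIMENSION_BYTES = 2048      # assumed combined limit for a dimension name + value
MAX_MEASURE_VALUE_BYTES = 2048  # assumed limit for a single measure value


def check_record(record: dict) -> list[str]:
    """Return human-readable problems found in one Timestream record."""
    problems = []
    for dim in record.get("Dimensions", []):
        size = len(dim["Name"].encode()) + len(dim["Value"].encode())
        if size > MAX_DIMENSION_BYTES:
            problems.append(f"dimension {dim['Name']!r} is {size} bytes")
    value_size = len(str(record.get("MeasureValue", "")).encode())
    if value_size > MAX_MEASURE_VALUE_BYTES:
        problems.append(f"measure {record.get('MeasureName')!r} value is {value_size} bytes")
    return problems


for i, rec in enumerate(records):
    for problem in check_record(rec):
        print(f"record {i}: {problem}")
```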