These days we need to access data faster than ever, but getting at that data is often slow and cumbersome. So today we are going to build a small app that reads data from a CSV file and pushes it into a Redis server, where it is available in an easy and fast way.
Let's install Redis and Redis Commander (a GUI) using Docker Compose:
services:
  redis:
    image: redis:6.2.5
    command: redis-server --requirepass ${REDIS_PASSWORD}
    volumes:
      - redis:/var/lib/redis
      - redis-config:/usr/local/etc/redis/redis.conf
    ports:
      - ${REDIS_PORT}:6379
    networks:
      - redis-network
  redis-commander:
    image: rediscommander/redis-commander:latest
    restart: always
    environment:
      REDIS_HOSTS: redis
      REDIS_HOST: redis
      REDIS_PORT: 6379
      REDIS_PASSWORD: ${REDIS_PASSWORD}
      HTTP_USER: root
      HTTP_PASSWORD: root
    ports:
      - 8081:8081
    networks:
      - redis-network
volumes:
  redis:
  redis-config:
networks:
  redis-network:
    driver: bridge
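The compose file reads REDIS_PASSWORD and REDIS_PORT from the environment. A minimal .env file next to the docker-compose.yml could look like this (the values below are just placeholders, pick your own):

# .env — placeholder values
REDIS_PASSWORD=xxxx
REDIS_PORT=6379

Then bring the stack up:

docker compose up -d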
After that our services are running, and we can inspect the data in real time with Redis Commander at http://localhost:8081/
Now let's put some data into Redis using Python. First, install the packages:
pip install redis pyarrow pandas
Let's import some data into a pandas dataframe:
import redis
import pandas as pd
import pyarrow as pa
df = pd.read_csv('https://github.com/rondweb/python-redis-pandas/raw/main/LanguageDetection.csv', encoding='utf-8')
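If you want to sanity-check what was loaded (the exact columns and row count depend on the CSV), a quick peek works:

# inspect the shape and the first rows of the dataframe
print(df.shape)
print(df.head())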
Let's connect to our Redis instance:
r = redis.Redis(host='localhost', port=6379, db=0, password='xxxx')
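The password here must match the REDIS_PASSWORD used in the compose file. A quick ping confirms the connection is alive (assuming the container is up on localhost:6379):

r.ping()  # returns True if the connection and password are OK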
Now let's export the data from the pandas dataframe to Redis:
context = pa.default_serialization_context()
r.set("data_lang", context.serialize(df).to_buffer().to_pybytes())
And that's it! Our data is now accessible through Redis to any consumer:
df = context.deserialize(r.get("data_lang"))
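Note that pa.default_serialization_context() belongs to pyarrow's legacy serialization API, which is deprecated and no longer available in recent pyarrow releases, so the snippet above may fail depending on your version. A sketch of an equivalent approach on current pyarrow, using Parquet as the byte format and the same "data_lang" key, could be:

import io

# serialize: write the dataframe to an in-memory Parquet buffer and store the raw bytes in Redis
buffer = io.BytesIO()
df.to_parquet(buffer)
r.set("data_lang", buffer.getvalue())

# deserialize: read the bytes back from Redis into a dataframe
df = pd.read_parquet(io.BytesIO(r.get("data_lang")))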