Creating an activity stream using Postgres 15

An activity stream is a data format used to represent a list of recent activities performed by an individual or group on a social network, web application, or other platform. It typically includes information such as the type of activity (e.g., posting a status update, commenting on a post), the person or entity performing the activity, and any associated objects or targets (e.g., a photo or link). Activity streams can be used to track user behavior, personalize content recommendations, and facilitate social interactions between users.

An activity stream typically consists of the following parts:

  1. Actor: The person or entity that initiates the action.
  2. Verb: The action being taken.
  3. Object: The thing on which the action is taken.
  4. Target: The thing to which the action is directed.
  5. Time: The time at which the action occurred.
  6. Context: Any additional information about the action, such as the location or device used to perform it.
  7. Metadata: Additional information about the action, such as the user’s preferences or the permissions required to perform it.

Activity streams can be used to represent data from any system; the stream format itself does not impose any direct relationship between the activities and the objects they reference.
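
To make these parts concrete, here is a hypothetical example of a single activity serialized as JSON (the values are invented for illustration and loosely mirror the sample data used later in this article):

{
    "actor": { "type": "user", "name": "Alice" },
    "verb": "post",
    "object": { "type": "ad", "description": "Vintage bicycle, good condition" },
    "target": null,
    "time": "2023-03-12T17:54:11Z",
    "context": { "latitude": 59.9311, "longitude": 30.3609 },
    "metadata": { "visibility": "public" }
}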

With a basic understanding of what an activity stream is, we can use PostgreSQL as the database to implement one. PostgreSQL is particularly suitable for activity streams due to its built-in support for JSON columns, which can store data with flexible schemas, and its GIS functionality, which makes it easy to filter activities based on location.

For this project, I have chosen to use Postgres 15 with the PostGIS extension, as well as DBeaver Community Edition for managing the database. The GIS extension is especially useful for this project since we want to display only activities that occurred around specific geographical points.

Let’s begin our coding journey by creating the database. We will then enable PostGIS and create an object storage table, which will have a column to store the object type and a JSONB column to store the complete data of the object being stored.

CREATE DATABASE ActivityStream;

After creating the database and connecting to it, the next step is to enable the PostGIS extension using the following query.

CREATE EXTENSION IF NOT EXISTS postgis; -- Enable PostGIS extension

 

CREATE TABLE objectstorage (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    latitude DECIMAL(9,6) NOT NULL,
    longitude DECIMAL(9,6) NOT NULL,
    location GEOMETRY(Point, 4326), -- 4326 is the SRID for WGS 84, a common coordinate system for GPS data
    object_type TEXT NOT NULL,
    object_data JSONB NOT NULL,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);


CREATE OR REPLACE FUNCTION update_location() RETURNS TRIGGER AS $$
BEGIN
    NEW.location := ST_SetSRID(ST_MakePoint(NEW.longitude, NEW.latitude), 4326);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;


CREATE TRIGGER set_location
    BEFORE INSERT OR UPDATE
    ON objectstorage
    FOR EACH ROW
    EXECUTE FUNCTION update_location();

This script creates a table named objectstorage with columns for id, latitude, longitude, location, object_type, object_data, created_at, and updated_at. The id column is the primary key and generates a random UUID as its default value. The latitude and longitude columns store decimal values for geographic coordinates. The location column stores a geometry object of type Point using the WGS 84 coordinate system with SRID 4326. The object_type column stores the type of the object being stored, and the object_data column stores the complete data for the object in JSONB format. The created_at and updated_at columns store timestamps for when the row was created and last updated, respectively.

Additionally, this script creates a trigger function named update_location() that runs whenever a row is inserted or updated in the objectstorage table. The function populates the location column from the latitude and longitude columns using the ST_MakePoint() and ST_SetSRID() functions from PostGIS: ST_MakePoint() builds a point geometry from the longitude and latitude values (x = longitude, y = latitude), and ST_SetSRID() stamps it with the WGS 84 coordinate system (SRID 4326). The function then returns the modified row.
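
Before moving on, a quick sanity check confirms that the trigger behaves as expected. This is a hypothetical, throwaway example; the coordinates are arbitrary:

-- insert a row without specifying location; the trigger fills it in
INSERT INTO objectstorage (latitude, longitude, object_type, object_data)
VALUES (59.9343, 30.3351, 'user', '{"name": "Test user"}');

-- the geometry should come back as POINT(30.3351 59.9343), i.e. x = longitude, y = latitude
SELECT object_type, ST_AsText(location) FROM objectstorage;

-- clean up the test row
DELETE FROM objectstorage WHERE object_data->>'name' = 'Test user';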

To simplify our database interactions, we’ll create UPSERT functions as needed. Here’s an example of an UPSERT function we can use for the objectstorage table.

CREATE OR REPLACE FUNCTION upsert_objectstorage(
    p_id UUID, 
    p_latitude DECIMAL(9,6), 
    p_longitude DECIMAL(9,6),
    p_object_type TEXT,
    p_object_data JSONB
) RETURNS VOID AS $$
BEGIN
    -- Try to update the existing row
    UPDATE objectstorage SET
        latitude = p_latitude,
        longitude = p_longitude,
        location = ST_SetSRID(ST_MakePoint(p_longitude, p_latitude), 4326),
        object_type = p_object_type,
        object_data = p_object_data,
        updated_at = CURRENT_TIMESTAMP
    WHERE id = p_id;
    
    -- If no row was updated, insert a new one
    IF NOT FOUND THEN
        INSERT INTO objectstorage (id, latitude, longitude, location, object_type, object_data)
        VALUES (p_id, p_latitude, p_longitude, ST_SetSRID(ST_MakePoint(p_longitude, p_latitude), 4326), p_object_type, p_object_data);
    END IF;
END;
$$ LANGUAGE plpgsql;
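
As a side note, because id is the primary key, a similar effect could arguably be achieved with PostgreSQL’s native upsert syntax (INSERT ... ON CONFLICT, available since 9.5). Here is a sketch of an equivalent function body, using a hypothetical name so it does not replace the function above:

CREATE OR REPLACE FUNCTION upsert_objectstorage_native(
    p_id UUID,
    p_latitude DECIMAL(9,6),
    p_longitude DECIMAL(9,6),
    p_object_type TEXT,
    p_object_data JSONB
) RETURNS VOID AS $$
BEGIN
    INSERT INTO objectstorage (id, latitude, longitude, object_type, object_data)
    VALUES (p_id, p_latitude, p_longitude, p_object_type, p_object_data)
    ON CONFLICT (id) DO UPDATE SET
        latitude    = EXCLUDED.latitude,
        longitude   = EXCLUDED.longitude,
        object_type = EXCLUDED.object_type,
        object_data = EXCLUDED.object_data,
        updated_at  = CURRENT_TIMESTAMP;
    -- the location column is filled in by the set_location trigger in both branches
END;
$$ LANGUAGE plpgsql;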

Below is the code for the “activity” table, which is the central piece of an activity stream system. It includes a trigger function that updates the “location” column using PostGIS.

CREATE TABLE activity (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    verb TEXT NOT NULL,
    actor_id UUID NOT NULL REFERENCES objectstorage(id),
    object_id UUID NOT NULL REFERENCES objectstorage(id),
    target_id UUID REFERENCES objectstorage(id),
    latitude DECIMAL(9,6) NOT NULL,
    longitude DECIMAL(9,6) NOT NULL,
    location GEOMETRY(Point, 4326) NOT NULL,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

CREATE OR REPLACE FUNCTION update_activity_location() RETURNS TRIGGER AS $$
BEGIN
    NEW.location := ST_SetSRID(ST_MakePoint(NEW.longitude, NEW.latitude), 4326);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER set_activity_location
    BEFORE INSERT OR UPDATE
    ON activity
    FOR EACH ROW
    EXECUTE FUNCTION update_activity_location();
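
As an aside, update_activity_location() is identical to update_location() defined earlier. Because the function only reads NEW.latitude and NEW.longitude, the same function could arguably be reused for both tables, for example:

-- alternative: reuse the objectstorage trigger function instead of defining a new one
CREATE TRIGGER set_activity_location
    BEFORE INSERT OR UPDATE
    ON activity
    FOR EACH ROW
    EXECUTE FUNCTION update_location();

Keeping two separate functions, as we do here, is also a reasonable choice if the tables are expected to diverge later.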

 

Now, here is the UPSERT function for the activity table:

 

CREATE OR REPLACE FUNCTION upsert_activity(
    p_id UUID,
    p_verb TEXT,
    p_actor_id UUID,
    p_object_id UUID,
    p_target_id UUID,
    p_latitude DECIMAL(9,6),
    p_longitude DECIMAL(9,6)
) RETURNS VOID AS $$
BEGIN
    -- Try to update the existing row
    UPDATE activity SET
        verb = p_verb,
        actor_id = p_actor_id,
        object_id = p_object_id,
        target_id = p_target_id,
        latitude = p_latitude,
        longitude = p_longitude,
        location = ST_SetSRID(ST_MakePoint(p_longitude, p_latitude), 4326),
        updated_at = CURRENT_TIMESTAMP
    WHERE id = p_id;
    
    -- If no row was updated, insert a new one
    IF NOT FOUND THEN
        INSERT INTO activity (id, verb, actor_id, object_id, target_id, latitude, longitude, location)
        VALUES (p_id, p_verb, p_actor_id, p_object_id, p_target_id, p_latitude, p_longitude, ST_SetSRID(ST_MakePoint(p_longitude, p_latitude), 4326));
    END IF;
END;
$$ LANGUAGE plpgsql;



To avoid serialization issues and redundant code, we’ll modify our queries to return JSON arrays. We’ll add a new column named “self” to the activity table and create a trigger that saves the current activity values in JSON format.

 

ALTER TABLE activity ADD COLUMN self JSON;

CREATE OR REPLACE FUNCTION update_activity_self() RETURNS TRIGGER AS $$
BEGIN
    NEW.self = json_build_object(
        'id', NEW.id,
        'verb', NEW.verb,
        'actor_id',NEW.actor_id,
        'actor', (SELECT object_data FROM objectstorage WHERE id = NEW.actor_id),
        'object_id',NEW.object_id,
        'object', (SELECT object_data FROM objectstorage WHERE id = NEW.object_id),
        'target_id',NEW.target_id,
        'target', (SELECT object_data FROM objectstorage WHERE id = NEW.target_id),
        'latitude', NEW.latitude,
        'longitude', NEW.longitude,
        'created_at', NEW.created_at,
        'updated_at', NEW.updated_at
    )::jsonb;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER activity_self_trigger
    BEFORE INSERT OR UPDATE ON activity
    FOR EACH ROW
    EXECUTE FUNCTION update_activity_self();
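
Keep in mind that self is a denormalized snapshot built at insert or update time, so later changes to the referenced objectstorage rows are not reflected in it automatically. Because the trigger fires on every UPDATE, a no-op update is one way to rebuild the snapshots if that ever becomes necessary (a sketch; on a large table you would want to batch this):

-- re-fire the BEFORE UPDATE triggers to rebuild the JSON snapshots
UPDATE activity SET updated_at = updated_at;

With the self column in place, we can also add a helper that returns the activities near a given point as a single JSON array: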

CREATE OR REPLACE FUNCTION get_activities_by_distance_as_json(
    p_lat NUMERIC,
    p_long NUMERIC,
    p_distance INTEGER,
    p_page_num INTEGER,
    p_page_size INTEGER
) 
RETURNS JSON
AS $$
DECLARE
    activities_json JSON;
BEGIN
    SELECT json_agg(a.self) INTO activities_json
    FROM (
        SELECT a.self
        FROM activity a
        WHERE ST_DWithin(location::geography, ST_SetSRID(ST_Point(p_long, p_lat), 4326)::geography, p_distance)
        ORDER BY created_at DESC
        LIMIT p_page_size
        OFFSET (p_page_num - 1) * p_page_size
    ) a;
    
    RETURN activities_json;
END;
$$ LANGUAGE plpgsql;
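
The function above filters with ST_DWithin on location::geography and sorts by created_at, so on a large activity table it is worth indexing both. Here is a sketch; the index names are just suggestions:

-- expression index matching the ::geography cast used by ST_DWithin
CREATE INDEX activity_location_geog_idx ON activity USING GIST ((location::geography));

-- helps the ORDER BY created_at DESC pagination
CREATE INDEX activity_created_at_idx ON activity (created_at DESC);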

 

An activity stream without follow functionality would defeat its main purpose, which is to keep track of other actors’ activities without having to constantly visit their profile pages.

So here is the code for the follow functionality:

CREATE TABLE follow (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    follower_id UUID NOT NULL REFERENCES objectstorage(id),
    followee_id UUID NOT NULL REFERENCES objectstorage(id),
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    UNIQUE (follower_id, followee_id) -- required so follow_user() can rely on unique_violation for duplicates
);

CREATE OR REPLACE FUNCTION follow_user(
    p_follower_id UUID,
    p_followee_id UUID
) RETURNS VOID AS $$
BEGIN
    -- Try to insert a new row into the follow table
    -- If the row already exists, do nothing
    BEGIN
        INSERT INTO follow (follower_id, followee_id)
        VALUES (p_follower_id, p_followee_id);
    EXCEPTION WHEN unique_violation THEN
        RETURN;
    END;
END;
$$ LANGUAGE plpgsql;

CREATE OR REPLACE FUNCTION unfollow_user(
    p_follower_id UUID,
    p_followee_id UUID
) RETURNS VOID AS $$
BEGIN
    -- Delete the row from the follow table where the follower_id and followee_id match
    DELETE FROM follow
    WHERE follower_id = p_follower_id AND followee_id = p_followee_id;
END;
$$ LANGUAGE plpgsql;

 

To create an activity stream, we need to first identify the actors that we are following. To accomplish this, we can define a function that takes an ID of an object from our object storage and retrieves the IDs of all the actors that are being followed by that object.

Here’s the function code:

CREATE OR REPLACE FUNCTION get_following_ids(p_user_id UUID)
RETURNS UUID[] AS $$
DECLARE
  following_ids UUID[];
BEGIN
  SELECT ARRAY_AGG(followee_id) INTO following_ids
  FROM follow
  WHERE follower_id = p_user_id;
  
  RETURN following_ids;
END;
$$ LANGUAGE plpgsql;


 

Now that we have obtained the list of actors we are following, the next step is to retrieve their activities. However, this can be a challenging task due to two reasons: first, using a relational database could result in complex joins that could slow down the data retrieval process; second, the actors we are following might have produced a large number of activities, and retrieving them all at once could potentially overload the server. To address these issues, we will introduce pagination to our queries to ensure efficient and scalable data retrieval.

 

CREATE OR REPLACE FUNCTION get_activities_by_following(p_page_num INTEGER, p_page_size INTEGER, p_following_ids UUID[])
    RETURNS TABLE (
        id UUID,
        verb TEXT,
        actor_id UUID,
        object_id UUID,
        target_id UUID,
        latitude DECIMAL(9,6),
        longitude DECIMAL(9,6),
        location GEOMETRY(Point, 4326),
        self_data JSON,
        created_at TIMESTAMP WITH TIME ZONE,
        updated_at TIMESTAMP WITH TIME ZONE
    ) AS $$
BEGIN
    RETURN QUERY
    SELECT a.id, a.verb, a.actor_id, a.object_id, a.target_id, a.latitude, a.longitude, a.location, a."self" , a.created_at, a.updated_at
    FROM activity a
    WHERE a.actor_id = ANY(p_following_ids)
    ORDER BY a.created_at DESC
    LIMIT p_page_size
    OFFSET (p_page_num - 1) * p_page_size;
END;
$$ LANGUAGE plpgsql;
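
If the activity table grows large, the ANY(...) filter combined with the ORDER BY also benefits from a composite index; again, the index name is just a suggestion:

CREATE INDEX activity_actor_created_idx ON activity (actor_id, created_at DESC);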

 

We need a function that takes the result produced by the get_activities_by_following function, and converts it into a JSON array.

 

CREATE OR REPLACE FUNCTION get_activities_by_following_as_json(p_page_num INTEGER, p_page_size INTEGER, p_user_id UUID)
RETURNS JSON AS $$
DECLARE
    following_ids UUID[] := get_following_ids(p_user_id);
BEGIN
    RETURN (SELECT json_agg(self_data) FROM get_activities_by_following(p_page_num, p_page_size, following_ids));
END;
$$ LANGUAGE plpgsql;

 

To demonstrate our activity stream system, we need to create sample data. Let’s create 5 users and have them post ads on our objectstorage table.

 

--create users and activities

SELECT upsert_objectstorage(
    'b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01', -- object ID 1
    59.9311, -- latitude
    30.3609, -- longitude
    'user', -- object type
    '{"name": "Alice", "age": 27, "email": "alice@example.com", "picture_url": "https://example.com/pictures/alice.jpg"}' -- object data in JSON format
);
SELECT upsert_objectstorage(
    'cc7ebda2-019c-4387-925c-352f7e1f0b10', -- object ID 2
    59.9428, -- latitude
    30.3071, -- longitude
    'user', -- object type
    '{"name": "Bob", "age": 33, "email": "bob@example.com", "picture_url": "https://example.com/pictures/bob.jpg"}' -- object data in JSON format
);

SELECT upsert_objectstorage(
    '99875f15-49ee-4e6d-b356-cbab4f4e4a4c', -- object ID 3
    59.9375, -- latitude
    30.3086, -- longitude
    'user', -- object type
    '{"name": "Charlie", "age": 42, "email": "charlie@example.com", "picture_url": "https://example.com/pictures/charlie.jpg"}' -- object data in JSON format
);

SELECT upsert_objectstorage(
    '34f6c0a5-5d5e-463f-a2cf-11b7529a92a1', -- object ID 4
    59.9167, -- latitude
    30.25, -- longitude
    'user', -- object type
    '{"name": "Dave", "age": 29, "email": "dave@example.com", "picture_url": "https://example.com/pictures/dave.jpg"}' -- object data in JSON format
);

SELECT upsert_objectstorage(
    '8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54', -- object ID 5
    59.9391, -- latitude
    30.3158, -- longitude
    'user', -- object type
    '{"name": "Eve", "age": 25, "email": "eve@example.com", "picture_url": "https://example.com/pictures/eve.jpg"}' -- object data in JSON format
);

--create ads

-- Bob's ad
SELECT upsert_objectstorage(
'f6c7599e-8161-4d54-82ec-faa13bb8cbf7', -- object ID
59.9428, -- latitude (near Saint Petersburg)
30.3071, -- longitude (near Saint Petersburg)
'ad', -- object type
'{"description": "Vintage bicycle, good condition", "ad_type": "sale", "picture_url": "https://example.com/pictures/bicycle.jpg"}' -- object data in JSON format
);

SELECT upsert_activity(
gen_random_uuid(), -- activity ID
'post', -- verb
'cc7ebda2-019c-4387-925c-352f7e1f0b10', -- actor ID (Bob)
'f6c7599e-8161-4d54-82ec-faa13bb8cbf7', -- object ID (Bob's ad)
NULL, -- target ID (no target)
59.9428, -- latitude (near Saint Petersburg)
30.3071 -- longitude (near Saint Petersburg)
);

-- Charlie's ad
SELECT upsert_objectstorage(
'41f7c558-1cf8-4f2b-b4b4-4d4e4df50843', -- object ID
59.9375, -- latitude (near Saint Petersburg)
30.3086, -- longitude (near Saint Petersburg)
'ad', -- object type
'{"description": "Smartphone, unlocked", "ad_type": "sale", "picture_url": "https://example.com/pictures/smartphone.jpg"}' -- object data in JSON format
);

SELECT upsert_activity(
gen_random_uuid(), -- activity ID
'post', -- verb
'99875f15-49ee-4e6d-b356-cbab4f4e4a4c', -- actor ID (Charlie)
'41f7c558-1cf8-4f2b-b4b4-4d4e4df50843', -- object ID (Charlie's ad)
NULL, -- target ID (no target)
59.9375, -- latitude (near Saint Petersburg)
30.3086 -- longitude (near Saint Petersburg)
);


-- Dave's ad
SELECT upsert_objectstorage(
'c3dd7b47-1bba-4916-8a6a-8b5f2b50ba88', -- object ID
59.9139, -- latitude (near Saint Petersburg)
30.3341, -- longitude (near Saint Petersburg)
'ad', -- object type
'{"description": "Vintage camera, working condition", "ad_type": "exchange", "picture_url": "https://example.com/pictures/camera.jpg"}' -- object data in JSON format
);

SELECT upsert_activity(
gen_random_uuid(), -- activity ID
'post', -- verb
'34f6c0a5-5d5e-463f-a2cf-11b7529a92a1', -- actor ID (Dave)
'c3dd7b47-1bba-4916-8a6a-8b5f2b50ba88', -- object ID (Dave's ad)
NULL, -- target ID (no target)
59.9139, -- latitude (near Saint Petersburg)
30.3341 -- longitude (near Saint Petersburg)
);

-- Eve's ad
SELECT upsert_objectstorage(
'3453f3c1-296d-47a5-a5a5-f5db5ed3f3b3', -- object ID
59.9375, -- latitude (near Saint Petersburg)
30.3141, -- longitude (near Saint Petersburg)
'ad', -- object type
'{"description": "Plants, various types", "ad_type": "want", "picture_url": "https://example.com/pictures/plants.jpg"}' -- object data in JSON format
);

SELECT upsert_activity(
gen_random_uuid(), -- activity ID
'post', -- verb
'8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54', -- actor ID (Eve)
'3453f3c1-296d-47a5-a5a5-f5db5ed3f3b3', -- object ID (Eve's ad)
NULL, -- target ID (no target)
59.9375, -- latitude (near Saint Petersburg)
30.3141 -- longitude (near Saint Petersburg)
);

-- Alice's ad
SELECT upsert_objectstorage(
    'b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c02', -- new object ID for Alice's ad
    59.9311, -- latitude
    30.3609, -- longitude
    'ad', -- object type
    '{"description": "Used bicycle, good condition", "ad_type": "sell", "picture_url": "https://example.com/pictures/bicycle.jpg"}' -- ad data in JSON format
);

SELECT upsert_activity(
    gen_random_uuid(), -- activity ID
    'post', -- verb
    'b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01', -- actor ID (Alice)
    'b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c02', -- object ID (Alice's ad)
    NULL, -- target ID (no target)
    59.9311, -- latitude
    30.3609 -- longitude
);
-- Charlie's ad
SELECT upsert_objectstorage(
    '99875f15-49ee-4e6d-b356-cbab4f4e4a4d', -- new object ID for Charlie's ad
    59.9375, -- latitude
    30.3086, -- longitude
    'ad', -- object type
    '{"description": "Books, various genres", "ad_type": "sell", "picture_url": "https://example.com/pictures/books.jpg"}' -- ad data in JSON format
);

SELECT upsert_activity(
    gen_random_uuid(), -- activity ID
    'post', -- verb
    '99875f15-49ee-4e6d-b356-cbab4f4e4a4c', -- actor ID (Charlie)
    '99875f15-49ee-4e6d-b356-cbab4f4e4a4d', -- object ID (Charlie's ad)
    NULL, -- target ID (no target)
    59.9428, -- latitude
    30.3071 -- longitude
);
-- Bob's ad
SELECT upsert_objectstorage(
    'cc7ebda2-019c-4387-925c-352f7e1f0b12', -- new object ID for Bob's ad
    59.9428, -- latitude
    30.3071, -- longitude
    'ad', -- object type
    '{"description": "Vintage record player, needs repair", "ad_type": "exchange", "picture_url": "https://example.com/pictures/record_player.jpg"}' -- ad data in JSON format
);
SELECT upsert_activity(
    gen_random_uuid(), -- activity ID
    'post', -- verb
    'cc7ebda2-019c-4387-925c-352f7e1f0b10', -- actor ID (Bob)
    'cc7ebda2-019c-4387-925c-352f7e1f0b12', -- object ID (Bob's ad)
    NULL, -- target ID (no target)
    59.9428, -- latitude
    30.3071 -- longitude
);

 

Now that we have created objects and activities, each user's activity stream will still be empty, because a stream only shows activities from the actors that user follows. Therefore, we need to establish follow relationships between users to produce a stream of their activities.

 

-- Follow data

-- Have Eve and Alice follow themselves (so their own posts appear in their streams)
SELECT follow_user('8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54', '8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54');
SELECT follow_user('b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01', 'b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01');
-- Have Eve and Alice follow Bob, Charlie, and Dave
SELECT follow_user('8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54', 'cc7ebda2-019c-4387-925c-352f7e1f0b10');
SELECT follow_user('b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01', 'cc7ebda2-019c-4387-925c-352f7e1f0b10');
SELECT follow_user('8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54', '99875f15-49ee-4e6d-b356-cbab4f4e4a4c');
SELECT follow_user('b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01', '99875f15-49ee-4e6d-b356-cbab4f4e4a4c');
SELECT follow_user('8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54', '34f6c0a5-5d5e-463f-a2cf-11b7529a92a1');
SELECT follow_user('b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01', '34f6c0a5-5d5e-463f-a2cf-11b7529a92a1');


 

It’s time to test our activity stream. First, let’s get the list of actors that Alice is following.

SELECT get_following_ids('b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01'); -- get the objects that Alice is following

Here is the result:

get_following_ids                                                                                                                                    
-----------------------------------------------------------------------------------------------------------------------------------------------------+
{
b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01,
cc7ebda2-019c-4387-925c-352f7e1f0b10,
99875f15-49ee-4e6d-b356-cbab4f4e4a4c,
34f6c0a5-5d5e-463f-a2cf-11b7529a92a1
}

Now let’s get the activities of the objects that Alice is following. We will request page 1 and show 10 records per page.

SELECT * FROM get_activities_by_following(1, 10, ARRAY(SELECT get_following_ids('b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01')));

Here is the result (only the first few columns are shown):

id                                  |verb|actor_id                            |object_id                           |target_id|
------------------------------------+----+------------------------------------+------------------------------------+---------+
f905356f-2fe3-4f51-b6de-d2cd107f46b8|post|cc7ebda2-019c-4387-925c-352f7e1f0b10|f6c7599e-8161-4d54-82ec-faa13bb8cbf7|         |
43a92964-5bcd-4096-93bc-e5e87c76455e|post|99875f15-49ee-4e6d-b356-cbab4f4e4a4c|41f7c558-1cf8-4f2b-b4b4-4d4e4df50843|         |
69ec53ac-bbaa-4c36-81ef-8764647d7914|post|34f6c0a5-5d5e-463f-a2cf-11b7529a92a1|c3dd7b47-1bba-4916-8a6a-8b5f2b50ba88|         |
de6b052f-8a84-4b37-9920-9f76cbb539d4|post|b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01|b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c02|         |
3c35544a-3ee0-47ae-bddc-1017127ff4d6|post|99875f15-49ee-4e6d-b356-cbab4f4e4a4c|99875f15-49ee-4e6d-b356-cbab4f4e4a4d|         |
e76dcbb9-56c4-46d8-bb42-2f67dec4c5aa|post|cc7ebda2-019c-4387-925c-352f7e1f0b10|cc7ebda2-019c-4387-925c-352f7e1f0b12|         |


 

Now let’s make this better and get the activities in JSON format:

SELECT * FROM get_activities_by_following_as_json(1, 2, 'b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01');

And here is the result:

[
   {
      "id":"43a92964-5bcd-4096-93bc-e5e87c76455e",
      "verb":"post",
      "actor":{
         "age":42,
         "name":"Charlie",
         "email":"charlie@example.com",
         "picture_url":"https://example.com/pictures/charlie.jpg"
      },
      "object":{
         "ad_type":"sale",
         "description":"Smartphone, unlocked",
         "picture_url":"https://example.com/pictures/smartphone.jpg"
      },
      "target":null,
      "actor_id":"99875f15-49ee-4e6d-b356-cbab4f4e4a4c",
      "latitude":59.937500,
      "longitude":30.308600,
      "object_id":"41f7c558-1cf8-4f2b-b4b4-4d4e4df50843",
      "target_id":null,
      "created_at":"2023-03-12T17:54:11.636928+03:00",
      "updated_at":"2023-03-12T17:54:11.636928+03:00"
   },
   {
      "id":"f905356f-2fe3-4f51-b6de-d2cd107f46b8",
      "verb":"post",
      "actor":{
         "age":33,
         "name":"Bob",
         "email":"bob@example.com",
         "picture_url":"https://example.com/pictures/bob.jpg"
      },
      "object":{
         "ad_type":"sale",
         "description":"Vintage bicycle, good condition",
         "picture_url":"https://example.com/pictures/bicycle.jpg"
      },
      "target":null,
      "actor_id":"cc7ebda2-019c-4387-925c-352f7e1f0b10",
      "latitude":59.942800,
      "longitude":30.307100,
      "object_id":"f6c7599e-8161-4d54-82ec-faa13bb8cbf7",
      "target_id":null,
      "created_at":"2023-03-12T17:54:11.636928+03:00",
      "updated_at":"2023-03-12T17:54:11.636928+03:00"
   }
]

 

And now, before I go, here is a good one: this query will return all the activities that happened around a specific geographic location.

 

SELECT get_activities_by_distance_as_json(59.9343, 30.3351, 1600, 1, 10);

Here are the results; all those places are near my home :)

         "name":"Charlie",
         "email":"charlie@example.com",
         "picture_url":"https://example.com/pictures/charlie.jpg"
      },
      "object":{
         "ad_type":"sale",
         "description":"Smartphone, unlocked",
         "picture_url":"https://example.com/pictures/smartphone.jpg"
      },
      "target":null,
      "actor_id":"99875f15-49ee-4e6d-b356-cbab4f4e4a4c",
      "latitude":59.937500,
      "longitude":30.308600,
      "object_id":"41f7c558-1cf8-4f2b-b4b4-4d4e4df50843",
      "target_id":null,
      "created_at":"2023-03-12T17:54:11.636928+03:00",
      "updated_at":"2023-03-12T17:54:11.636928+03:00"
   },
   {
      "id":"e5e26df0-e58f-4b25-96c1-5b3460beb0c8",
      "verb":"post",
      "actor":{
         "age":25,
         "name":"Eve",
         "email":"eve@example.com",
         "picture_url":"https://example.com/pictures/eve.jpg"
      },
      "object":{
         "ad_type":"want",
         "description":"Plants, various types",
         "picture_url":"https://example.com/pictures/plants.jpg"
      },
      "target":null,
      "actor_id":"8d7685d5-5b1f-4a7a-835e-b89e7d3a3b54",
      "latitude":59.937500,
      "longitude":30.314100,
      "object_id":"3453f3c1-296d-47a5-a5a5-f5db5ed3f3b3",
      "target_id":null,
      "created_at":"2023-03-12T17:54:11.636928+03:00",
      "updated_at":"2023-03-12T17:54:11.636928+03:00"
   },
   {
      "id":"de6b052f-8a84-4b37-9920-9f76cbb539d4",
      "verb":"post",
      "actor":{
         "age":27,
         "name":"Alice",
         "email":"alice@example.com",
         "picture_url":"https://example.com/pictures/alice.jpg"
      },
      "object":{
         "ad_type":"sell",
         "description":"Used bicycle, good condition",
         "picture_url":"https://example.com/pictures/bicycle.jpg"
      },
      "target":null,
      "actor_id":"b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c01",
      "latitude":59.931100,
      "longitude":30.360900,
      "object_id":"b8dcbf13-cb01-4a35-93d5-5a5f5a2f6c02",
      "target_id":null,
      "created_at":"2023-03-12T17:54:11.636928+03:00",
      "updated_at":"2023-03-12T17:54:11.636928+03:00"
   }
]

 

In conclusion, this article provided a step-by-step guide to building an activity stream system with PostgreSQL as the database. It covered the object storage table, the activity table, follow functionality, and pagination to keep retrieval efficient as the volume of user-generated data grows. It also showed how the PostGIS extension enables geographical search and how JSONB columns make it easy to store flexible, complex data structures. By following this guide, developers can create their own activity stream systems on PostgreSQL and integrate them into their applications.

You can find the complete code for this tutorial here:

https://github.com/egarim/PostgresActivityStream

 

 

 

 

Running DotNet applications as a service in macOS

Running an application as a service in macOS can be accomplished using a tool called launchd. launchd is a system-wide daemon manager that is built into macOS. It can be used to run applications automatically at startup, or on a schedule. In this article, we’ll show you how to run an application as a service using launchd, and explain the difference between LaunchDaemons and LaunchAgents.

Step 1: Create a launchd plist file

The first step in running an application as a service using launchd is to create a property list file called a "launchd plist". This file defines the service and its settings, including the name, the program to run, and any environment variables or other settings. The plist file is an XML file and should have the extension .plist. An example of a plist file is the following:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.myservice</string>
    <key>ProgramArguments</key>
    <array>
        <string>/path/to/your/program</string>
    </array>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>

Step 2: Copy the plist file to the appropriate directory

Once the plist file is created, you will need to copy it to the appropriate directory. There are two main directories for launchd plist files: /Library/LaunchDaemons/ and /Library/LaunchAgents/.

The difference between LaunchDaemons and LaunchAgents is that LaunchDaemons are used to run background services that do not require a user to be logged in, while LaunchAgents are used to run applications that are associated with a specific user account.

For example, if you want to run a service that starts automatically at boot time and runs in the background, you would copy the plist file to /Library/LaunchDaemons/. If you want to run an application that starts automatically when a user logs in, you would copy the plist file to /Library/LaunchAgents/.

Step 3: Load the service

Once the plist file is in the appropriate directory, use the launchctl command to load the service. For example:

sudo launchctl load /Library/LaunchDaemons/com.example.myservice.plist

Step 4: Verify that the service is running

You can use the launchctl command to check the status of the service:

launchctl list

This will show all the services that are currently running, and their status.

In summary, launchd is a powerful tool built into macOS that allows you to run applications as services. To use launchd, you need to create a plist file that defines the service and its settings, then copy it to the appropriate directory, and load it using the launchctl command. The difference between LaunchDaemons and LaunchAgents is that the former are used to run background services that do not require a user to be logged in, while the latter are used to run applications that are associated with a specific user account.

Entity Framework Core & lazy loading

Entity Framework Core & lazy loading

In Entity Framework Core 7 (EF7), lazy loading is a technique used to delay the loading of related entities until they are actually needed. This can help improve the performance of an application by reducing the amount of data that is retrieved from the database upfront.

To implement lazy loading in EF7, the “virtual” modifier is used on the navigation properties of an entity class. Navigation properties are used to represent relationships between entities, such as one-to-many or many-to-many.

For example, consider the following code snippet for a “Course” entity class and a “Student” entity class in EF7, with a one-to-many relationship between them:

public class Course
{
    public int CourseId { get; set; }
    public string CourseName { get; set; }
    public virtual ICollection<Student> Students { get; set; }
}

public class Student
{
    public int StudentId { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public int CourseId { get; set; }
    public virtual Course Course { get; set; }
}

 

In this example, the “Students” navigation property in the “Course” class and the “Course” navigation property in the “Student” class are both marked as “virtual”. This allows EF7 to override these properties with a proxy at runtime, enabling lazy loading for the related entities.

To use lazy loading in an EF7 application, lazy-loading proxies must be enabled when the context is configured: reference the Microsoft.EntityFrameworkCore.Proxies package and call UseLazyLoadingProxies() when building the context options. Lazy loading can then be switched on or off per context instance through the ChangeTracker.LazyLoadingEnabled property, which defaults to true. When lazy loading is enabled, related entities will not be loaded from the database until they are actually accessed.
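
Here is a minimal configuration sketch, assuming a SchoolContext for the classes above and using SQLite purely as an example provider:

using Microsoft.EntityFrameworkCore;

public class SchoolContext : DbContext
{
    public DbSet<Course> Courses { get; set; }
    public DbSet<Student> Students { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder
            .UseLazyLoadingProxies()             // requires the Microsoft.EntityFrameworkCore.Proxies package
            .UseSqlite("Data Source=school.db"); // example provider; substitute your own
    }
}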

For example, the following code demonstrates how lazy loading can be used to retrieve a list of courses and their students:

using (var context = new SchoolContext())
{
    context.ChangeTracker.LazyLoadingEnabled = true; // true is the default; proxies must also be enabled via UseLazyLoadingProxies()
    var courses = context.Courses.ToList();
    foreach (var course in courses)
    {
        Console.WriteLine("Course: " + course.CourseName);
        foreach (var student in course.Students)
        {
            Console.WriteLine("Student: " + student.FirstName + " " + student.LastName);
        }
    }
}

In this code, the list of courses is retrieved from the database and stored in the “courses” variable. The students for each course are not retrieved until they are accessed in the inner loop. This allows the application to retrieve only the data that is needed, improving performance and reducing the amount of data transferred from the database.

Lazy loading can be a useful tool for optimizing the performance of an EF7 application, but it is important to consider the trade-offs and use it appropriately. Lazy loading can increase the number of database queries and the overall complexity of an application, and it may not always be the most efficient approach.

 

5 Good Practices for Integration Testing with NUnit

Integration tests are a crucial part of any software development process, as they help ensure that different parts of a system are working together correctly. When writing integration tests, it is important to follow best practices in order to ensure that your tests are effective and maintainable. Here are a few good practices for integration testing using NUnit:

  1. Use test fixtures: NUnit provides a concept called “test fixtures,” which allow you to set up and tear down common resources that are needed by multiple test cases. This can help reduce duplication and make your tests more maintainable.
    [TestFixture]
    public class DatabaseTests
    {
        private Database _database;
    
        [SetUp]
        public void SetUp()
        {
            _database = new Database();
        }
    
        [TearDown]
        public void TearDown()
        {
            _database.Dispose();
        }
    
        [Test]
        public void Test1()
        {
            // test code goes here
        }
    
        [Test]
        public void Test2()
        {
            // test code goes here
        }
    }
    

     

  2. Use setup and teardown methods: In addition to the per-test [SetUp] and [TearDown] methods shown above, NUnit provides [OneTimeSetUp] and [OneTimeTearDown] methods that run once per fixture. The one-time methods are a good fit for expensive shared resources, while the per-test methods handle setting up test data and cleaning up after each test case.
    [TestFixture]
    public class DatabaseTests
    {
        private Database _database;
    
        [OneTimeSetUp]
        public void FixtureSetUp()
        {
            // runs once, before the first test in the fixture
            _database = new Database();
        }
    
        [OneTimeTearDown]
        public void FixtureTearDown()
        {
            // runs once, after the last test in the fixture
            _database.Dispose();
        }
    
        [SetUp]
        public void TestSetup()
        {
            // per-test setup code goes here
        }
    
        [TearDown]
        public void TestTeardown()
        {
            // per-test teardown code goes here
        }
    
        [Test]
        public void Test1()
        {
            // test code goes here
        }
    
        [Test]
        public void Test2()
        {
            // test code goes here
        }
    }
    

     

  3. Use test cases: NUnit allows you to specify multiple test cases for a single test method using the TestCase attribute. This can help reduce duplication and make it easier to test different scenarios.
    [TestFixture]
    public class CalculatorTests
    {
        [TestCase(1, 2, 3)]
        [TestCase(10, 20, 30)]
        [TestCase(-1, -2, -3)]
        public void TestAdd(int x, int y, int expected)
        {
            Calculator calculator = new Calculator();
            int result = calculator.Add(x, y);
            Assert.AreEqual(expected, result);
        }
    }
    

     

  4. Use the Assert class: NUnit provides a variety of assertions that can be used to verify the behavior of your code. It is important to use these assertions rather than manually checking for expected results, as they provide better error messages and make it easier to debug test failures.
    [TestFixture]
    public class CalculatorTests
    {
        [Test]
        public void TestAdd()
        {
            Calculator calculator = new Calculator();
            int result = calculator.Add(1, 2);
            Assert.AreEqual(3, result);
        }
    
        [Test]
        public void TestSubtract()
        {
            Calculator calculator = new Calculator();
            int result = calculator.Subtract(10, 5);
            Assert.AreEqual(5, result);
        }
    }
    

     

  5. Use test categories: NUnit allows you to categorize your tests using the Category attribute. This can be helpful for organizing your tests and selectively running only certain categories of tests.
    [TestFixture]
    public class DatabaseTests
    {
        [Test]
        [Category("Database")]
        public void Test1()
        {
            // test code goes here
        }
    
        [Test]
        [Category("Database")]
        public void Test2()
        {
            // test code goes here
        }
    
        [Test]
        [Category("API")]
        public void Test3()
        {
            // test code goes here
        }
    }
       
    

     

By following these best practices, you can write integration tests that are effective, maintainable, and easy to understand. This will help you ensure that your code is working correctly and reduce the risk of regressions as you continue to develop and evolve your system.

Moving to Apple Silicon as a DotNet Developer

ARM (Advanced RISC Machine) is a popular architecture for mobile devices and other low-power devices. Microsoft has supported ARM architectures in .NET for many years, and this support has continued with the release of .NET 6 and .NET 7.

In .NET 6 and 7, support for ARM architectures has been improved and expanded in several ways. One of the key changes is a set of improvements to the ARM64 JIT (Just-In-Time) compiler, which allows .NET applications to take better advantage of the performance offered by the ARM64 architecture. .NET applications can be compiled and run natively on ARM64 devices, providing better performance and a more seamless experience for users.

Another important change in .NET 6 and 7 is the support for ARM32 and ARM64 in ASP.NET Core. This means that developers can now build and deploy web applications on ARM devices, making it easier to create cross-platform applications that can run on a wide range of devices.

In addition to these changes, .NET 7 also adds ahead-of-time (Native AOT) compilation, which allows developers to build self-contained native applications for ARM devices using C# and .NET. This makes it easier to create high-performance, native applications for ARM devices without having to write code in a different language.

In conclusion, the support for ARM architectures in .NET 6 and 7 is an important development for developers who want to build and deploy applications on devices such as Apple’s M1 and M2. It lets them take advantage of the performance and capabilities of the ARM architecture to create efficient applications that run on a wide range of hardware, from desktops to mobile and other low-power devices.