What are good approaches to querying data from a SQL database and mapping it into a struct that contains a slice of another struct type? In my case, I am using PostgreSQL, pgx to run the queries, and scany to map the row results into objects.
I have a posts table and a comments table. The relationship is that one post can have many comments.
These are my models representing the database tables:
package models

type Post struct {
    Id int64
    // more fields
    Comments []Comment
}

type Comment struct {
    Id     int64
    PostId int64
    // more fields
    Post Post
}
This is my PostRepository:
package repository

import (
    "context"
    "fmt"

    "github.com/georgysavva/scany/pgxscan"
    "github.com/jackc/pgx/v4/pgxpool"

    "kosev.com/databases/models"
)

type PostRepository struct {
    dbPool *pgxpool.Pool
}

func NewPostRepository(dbPool *pgxpool.Pool) *PostRepository {
    return &PostRepository{
        dbPool: dbPool,
    }
}
func (pr *PostRepository) GetAll() []models.Post {
    query := `SELECT * FROM posts
        LEFT OUTER JOIN "comments"
        ON "comments".post_id = posts.id;`

    var posts []models.Post
    rows, err := pr.dbPool.Query(context.Background(), query)
    if err != nil {
        fmt.Println("Error: ", err.Error())
        return posts
    }

    if err := pgxscan.ScanAll(&posts, rows); err != nil {
        fmt.Println("Error: ", err.Error())
    }
    return posts
}
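For completeness, this is roughly how I wire the repository up and call it. It is only a minimal sketch: the connection string is a placeholder and the package layout is assumed to match the import paths above.

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/jackc/pgx/v4/pgxpool"

    "kosev.com/databases/repository"
)

func main() {
    // Placeholder connection string; replace it with your own.
    dbPool, err := pgxpool.Connect(context.Background(), "postgres://user:pass@localhost:5432/blog")
    if err != nil {
        log.Fatal(err)
    }
    defer dbPool.Close()

    postRepo := repository.NewPostRepository(dbPool)
    for _, post := range postRepo.GetAll() {
        // Ideally each post already carries its own comments here.
        fmt.Printf("post %d has %d comments\n", post.Id, len(post.Comments))
    }
}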
What I am trying to achieve is to get all posts, with each post holding a slice of its own comments.
Now I know that I can:
- query all posts
- loop through the posts and, for each post, run a query that fetches its comments
- append each batch of comments to the slice of the corresponding post (a rough sketch of this approach follows the list)
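Here is that sketch, written as another method on the same PostRepository so it reuses the imports shown above. GetAllNaive is just a name I picked for illustration, and it assumes the comments table has the post_id column used in the JOIN above.

func (pr *PostRepository) GetAllNaive() []models.Post {
    ctx := context.Background()

    // 1. Query all posts.
    var posts []models.Post
    rows, err := pr.dbPool.Query(ctx, `SELECT * FROM posts;`)
    if err != nil {
        fmt.Println("Error: ", err.Error())
        return posts
    }
    if err := pgxscan.ScanAll(&posts, rows); err != nil {
        fmt.Println("Error: ", err.Error())
        return posts
    }

    // 2. For each post, run a second query for its comments (N+1 queries in total).
    for i := range posts {
        var comments []models.Comment
        rows, err := pr.dbPool.Query(ctx,
            `SELECT * FROM "comments" WHERE post_id = $1;`, posts[i].Id)
        if err != nil {
            fmt.Println("Error: ", err.Error())
            continue
        }
        if err := pgxscan.ScanAll(&comments, rows); err != nil {
            fmt.Println("Error: ", err.Error())
            continue
        }

        // 3. Attach the comments to their post.
        posts[i].Comments = comments
    }
    return posts
}

This works, but it issues one query per post, which is exactly what I would like to avoid.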
I am wondering if there is a better solution, one that only requires me to execute a single query and call some Scan() method that is smart enough to put the comments inside their posts.
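For comparison, this is the kind of single-query grouping I would otherwise have to write by hand, which is what I hope a smarter Scan() could do for me. It is only a sketch: I select just the id columns to keep it self-contained (a real version would list every column of both tables), GetAllGrouped is a made-up name, and it needs an extra "database/sql" import for the columns that are NULL because of the LEFT JOIN.

func (pr *PostRepository) GetAllGrouped() []models.Post {
    query := `SELECT posts.id, "comments".id, "comments".post_id
        FROM posts
        LEFT OUTER JOIN "comments" ON "comments".post_id = posts.id
        ORDER BY posts.id;`

    var posts []models.Post
    rows, err := pr.dbPool.Query(context.Background(), query)
    if err != nil {
        fmt.Println("Error: ", err.Error())
        return posts
    }
    defer rows.Close()

    index := map[int64]int{} // post id -> position in posts
    for rows.Next() {
        var postId int64
        var commentId, commentPostId sql.NullInt64 // NULL when a post has no comments
        if err := rows.Scan(&postId, &commentId, &commentPostId); err != nil {
            fmt.Println("Error: ", err.Error())
            return posts
        }

        // Create the post the first time we see its id.
        i, ok := index[postId]
        if !ok {
            posts = append(posts, models.Post{Id: postId})
            i = len(posts) - 1
            index[postId] = i
        }

        // Append the comment, if the row actually carries one.
        if commentId.Valid {
            posts[i].Comments = append(posts[i].Comments,
                models.Comment{Id: commentId.Int64, PostId: commentPostId.Int64})
        }
    }
    if rows.Err() != nil {
        fmt.Println("Error: ", rows.Err().Error())
    }
    return posts
}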