How to Convert a Database Row into a Struct

How do I convert a database row into a struct?

Here's one way to do it: scan each column into the corresponding struct field manually in the Scan call.

func getUser(name string) (*User, error) {
    var u User
    // getConnection calls sql.Open, etc.
    db := getConnection()
    // note: the $1 placeholder syntax below is Postgres-specific
    // (SELECT * with a positional Scan breaks if the schema changes,
    // so list the columns explicitly)
    err := db.QueryRow("SELECT id, name, score FROM users WHERE name = $1", name).Scan(&u.Id, &u.Name, &u.Score)
    if err != nil {
        return nil, err
    }
    return &u, nil
}

Scanning large rows into structs in Go

the ultimate goal of this function is to return a JSON array of objects

It sounds like you could bypass the struct entirely then, and instead scan into a map[string]interface{} and do it all dynamically. You could do something like this:

rows, err := db.Query("SELECT * FROM coderyte")
if err != nil {
    return err
}
defer rows.Close()
cols, _ := rows.Columns()
store := []map[string]interface{}{}
for rows.Next() {
    // Scan needs a pointer per column, so point each entry of
    // columnPointers at the matching slot in columns.
    columns := make([]interface{}, len(cols))
    columnPointers := make([]interface{}, len(cols))
    for i := range columns {
        columnPointers[i] = &columns[i]
    }

    if err := rows.Scan(columnPointers...); err != nil {
        return err
    }

    m := make(map[string]interface{})
    for i, colName := range cols {
        val := columnPointers[i].(*interface{})
        m[colName] = *val
    }
    store = append(store, m)
}
js, _ := json.Marshal(store)
fmt.Println(string(js))
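To see why the pointer indirection works, here is the same trick in isolation, with the database removed and rows.Scan simulated by a plain slice of fake values (the column names and values are made up for the demo):

```go
package main

import "fmt"

func main() {
	cols := []string{"id", "name"}
	fakeScannedValues := []interface{}{7, "gopher"} // what rows.Scan would fill in

	// Scan needs a pointer per column, so each slot of columnPointers
	// points at the corresponding slot of columns.
	columns := make([]interface{}, len(cols))
	columnPointers := make([]interface{}, len(cols))
	for i := range columns {
		columnPointers[i] = &columns[i]
	}

	// Simulate rows.Scan writing through the pointers.
	for i, p := range columnPointers {
		*p.(*interface{}) = fakeScannedValues[i]
	}

	// The values landed in columns; pair them with their column names.
	m := make(map[string]interface{})
	for i, colName := range cols {
		m[colName] = *(columnPointers[i].(*interface{}))
	}
	fmt.Println(m["id"], m["name"]) // 7 gopher
}
```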

Now, obviously you could also convert it to a struct, since you could take the json and do json.Unmarshal, but given your use case that seems like a pointless extra step.

js, _ := json.Marshal(store)
structs := []Coderyte{}
json.Unmarshal(js, &structs)

All that being said, you should probably just use sqlx; it likely does cleverer things than this, and does them more efficiently.

golang: convert row sql to object

You can use something like this (not yet tested, and it needs optimization):

func (r *personRepository) GetAll() ([]*entities.Person, error) {
    qry := `select id, first_name, last_name, email, password, created_at, updated_at from persons`
    rows, err := r.conn.Query(context.Background(), qry)

    var items []*entities.Person
    if err != nil {
        // No result found with the query.
        if err == pgx.ErrNoRows {
            return items, nil
        }

        // An error happened.
        log.Printf("can't get person list: %v\n", err)
        return items, err
    }

    defer rows.Close()

    for rows.Next() {
        // Build a Person item for each row.
        // The scan targets must match the query's column order.

        var id, firstName, lastName, email, password string
        var createdAt, updatedAt time.Time

        err = rows.Scan(&id, &firstName, &lastName, &email, &password,
            &createdAt, &updatedAt)
        if err != nil {
            log.Printf("Failed to build item: %v\n", err)
            return items, err
        }

        item := &entities.Person{
            Id:        id,
            FirstName: firstName,
            // fill in the other fields
        }

        // Add item to the list.
        items = append(items, item)
    }

    return items, nil
}

Don't forget to add the comma after `password` in your query.


I am using entities and value objects; without the value objects it seems easy using marshal.

Sorry, I don't know about the value object in your question.

Convert a postgres row into golang struct with array field

Thanks to https://stackoverflow.com/a/44385791/10138004, I was able to solve this with the pq driver (https://godoc.org/github.com/lib/pq) for sqlx, by replacing []string with pq.StringArray.

So, updated struct looks like:

type Foo struct {
    Name  string         `db:"name"`
    Types pq.StringArray `db:"types"` // this is what changed
    Role  string         `db:"role"`
}

and the direct mapping works like a charm now:

var foo []Foo
query := `SELECT name, types, role FROM foo`
dbWrapper.err = dbConn.Select(&foo, query)

How do I convert my Table into a Normalised structure?

A friend wrote me some python code, which does (almost) what I need.

filein = open("skills.csv", "r")
fileout = open("out.txt", "w")
professionline = filein.readline()
professions = professionline.rstrip().split(",")
for line in filein.readlines():  # the header line was already consumed by readline() above
    fields = line.rstrip().split(",")
    skill = fields[0]
    i = 1
    for field in fields[1:]:
        cost = field
        if cost == "":
            cost = '0'

        fileout.write("INSERT INTO skillCosts (skill_id, profession_id, cost) VALUES ((SELECT _id FROM skills WHERE category=\"{}\"), (SELECT _id FROM professions WHERE profession=\"{}\"), {});\n".format(skill, professions[i], cost))
        i += 1
filein.close()
fileout.close()

This loads the .csv and converts the top line (the profession headers) into an array.

It then takes each row of Costs and creates an SQL Query.

This set of queries is written to a file (this example uses .txt, from which I can copy/paste into DB Browser's SQL input section).

It would be possible to format a single query to INSERT all the data, but my table goes over the 1,000-row limit.

c++, create structs from a database result

What you are looking for is something called "reflection", where names of variables, member fields, etc. can be used to refer to the actual variable or field. C++ does not support this. You will need to solve it some other way. A plausible solution is:

StructToAdd.ID = strtol(row[ROWSTRUCTURE_FIELD_ENUM_ID], nullptr, 10);
StructToAdd.test = row[ROWSTRUCTURE_FIELD_ENUM_test];

Or, as the comment suggests:

struct ROWSTRUCTURE
{
    int ID;
    std::string test;

    ROWSTRUCTURE& operator=(const MYSQL_ROW &row)
    {
        ID = strtol(row[ROWSTRUCTURE_FIELD_ENUM_ID], nullptr, 10);
        test = row[ROWSTRUCTURE_FIELD_ENUM_test];
        return *this;
    }
};

and then in the "main" function:

StructToAdd = row;

Convert flat SQL query result into table structure

Look at using PIVOT:

SELECT 'Amount' AS MaxAmount, [Group1], [Group2], [Group3]
FROM
    (SELECT descrip, Amount
     FROM Table) AS SourceTable
PIVOT
(
    MAX(Amount)
    FOR descrip IN ([Group1], [Group2], [Group3])
) AS PivotTable;

You need to put in an appropriate aggregate function; for this example I used MAX.

How to create a system to store and load any struct from database?

Note: I deleted my previous answer since it was on an entirely different interpretation of the question; I still leave it (deleted) as reference if anybody wishes to see it.

What you are asking for could be done in a language having reflection. Full-blown reflection fortunately is not required, but C++ does not offer any out of the box.

Warning: I personally consider the need dubious. Asking for a 1-to-1 mapping between a structure (and its field names) and the database table (and its column names) may cost more in terms of maintenance than it is worth. The notable question is how to retain backward/forward compatibility when changing the database schema. /Warning


A first standard way to add in reflection is to consider BOOST_FUSION_ADAPT_STRUCT.

struct Person {
    struct Name: Field<std::string> { static char const* N() { return "name"; } };
    struct Age: Field<int> { static char const* N() { return "age"; } };

    Name name;
    Age age;
}; // struct Person

BOOST_FUSION_ADAPT_STRUCT(
    Person,
    (Name, name)
    (Age, age))

It's a double-trick:

  • A Fusion sequence can be iterated over at runtime
  • Each attribute is augmented by a name

Note: the same thing can be achieved by using a Boost.Fusion.Map sequence.

Of course, this is not perfect, since it actually requires you to alter the definition of the structure. It is probably possible, though it might be tough, to come up with an equivalent macro that does the name-augmenting in place, so the base structure need not be altered; if you manage that, it would be worth posting your own solution :)


Another solution is to actually go for a fuller reflection-based approach. A solution to obtain reflection based on C99 variadic macros was added some time ago by @Paul here.

If you are into macros, I would note it could probably be adapted via Boost.Preprocessor to work with its sequences (it would add a couple more parentheses, but be standard).

Once again, though, this requires modifying the declaration of the structure.


Personally, I would love to see the first solution enhanced to adapt 3rd-party structures. On the other hand, given the strong coupling introduced between your Object Model and the Database Schema, it might be best to restrict yourself to your own types anyway.


