Convert Array into CSV

Dump a NumPy array into a CSV file

numpy.savetxt saves an array to a text file:

import numpy
a = numpy.asarray([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
numpy.savetxt("foo.csv", a, delimiter=",")
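One thing to watch: by default savetxt writes every value in "%.18e" float notation. A fmt argument keeps integers readable, and numpy.loadtxt reads the file back, so you can round-trip the data (same filename as above):

```python
import numpy

a = numpy.asarray([[1, 2, 3], [4, 5, 6], [7, 8, 9]])

# Write integers instead of the default "%.18e" float format
numpy.savetxt("foo.csv", a, delimiter=",", fmt="%d")

# Round-trip: read the CSV back into an array
b = numpy.loadtxt("foo.csv", delimiter=",")
print(numpy.array_equal(a, b))  # True
```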

Convert array into CSV

I'm using the following function for that; it's adapted from one of the comments on the fputcsv manual entry. You'll probably want to flatten that array first; I'm not sure what happens if you pass in a multi-dimensional one.

/**
 * Formats a line (passed as a fields array) as CSV and returns the CSV as a string.
 * Adapted from http://us3.php.net/manual/en/function.fputcsv.php#87120
 */
function arrayToCsv( array &$fields, $delimiter = ';', $enclosure = '"', $encloseAll = false, $nullToMysqlNull = false ) {
    $delimiter_esc = preg_quote($delimiter, '/');
    $enclosure_esc = preg_quote($enclosure, '/');

    $output = array();
    foreach ( $fields as $field ) {
        if ($field === null && $nullToMysqlNull) {
            $output[] = 'NULL';
            continue;
        }

        // Enclose fields containing $delimiter, $enclosure or whitespace
        if ( $encloseAll || preg_match( "/(?:{$delimiter_esc}|{$enclosure_esc}|\s)/", $field ) ) {
            $output[] = $enclosure . str_replace($enclosure, $enclosure . $enclosure, $field) . $enclosure;
        }
        else {
            $output[] = $field;
        }
    }

    return implode( $delimiter, $output );
}

How to convert Array to Array list for CSV export using FputCSV?

Just cast the objects to arrays with (array):

$file = fopen($CsvFile, 'w');
// Header row: the property names of the first object
fputcsv($file, array_keys(get_object_vars(reset($data))));

foreach ($data as $row) {
    fputcsv($file, (array)$row);
}
fclose($file);

If the object property names and $HeaderFields don't match, write $HeaderFields as the first row instead.

How to convert multiple arrays to CSV columns using JavaScript?

Your csvConstructor() iterates each column individually, but for a table you need to iterate the columns in parallel. It also tries to juggle two concerns at once (transforming the data and constructing the CSV string), which makes it hard to follow.

The problem splits logically into two phases. In the first phase, you want a well-formatted array that reflects the CSV's tabular structure (for a large data set you would probably want an iterator that yields rows one by one instead, but I won't do that version here).

The output should be in this format:

const result = [
  ['header A', 'header B'],
  ['value A', 'value B'],
  ['value A', 'value B'],
  // ...etc...
]

Once we have that structure, we can transform it into CSV.

So to get that based on your data:

function toCsvRows(headers, columns) {
  const output = [headers]
  const numRows = columns.map(col => col.length)
    .reduce((a, b) => Math.max(a, b))

  for (let row = 0; row < numRows; row++) {
    output.push(columns.map(c => c[row] || ''))
  }

  return output
}

function toCsvString(data) {
  let output = ''
  data.forEach(row => output += row.join(',') + '\n')
  return output
}

function csvConstructor(headers, columns) {
  return toCsvString(toCsvRows(headers, columns))
}

Here's a working fiddle:

https://jsfiddle.net/foxbunny/fdxp3bL5/
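The same parallel-column idea is worth seeing outside JavaScript too. For comparison, here is a minimal Python sketch of the two-phase approach, using itertools.zip_longest to iterate the columns in parallel and pad the shorter ones with '' (the sample headers and columns are made up for illustration):

```python
from itertools import zip_longest

def to_csv_rows(headers, columns):
    # zip_longest walks all columns in parallel, padding short ones with ''
    return [headers] + [list(row) for row in zip_longest(*columns, fillvalue='')]

def to_csv_string(rows):
    return ''.join(','.join(map(str, row)) + '\n' for row in rows)

# Hypothetical sample data: second column is shorter than the first
headers = ['name', 'score']
columns = [['ann', 'bob', 'cid'], [1, 2]]
print(to_csv_string(to_csv_rows(headers, columns)))
# name,score
# ann,1
# bob,2
# cid,
```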

EDIT: Actually, let me do the memory-efficient version as well:

function *toCsvRows(headers, columns) {
  yield headers

  const numRows = columns.map(col => col.length)
    .reduce((a, b) => Math.max(a, b))

  for (let row = 0; row < numRows; row++) {
    yield columns.map(c => c[row] || '')
  }
}

function toCsvString(data) {
  let output = ''
  for (const row of data) {
    output += row.join(',') + '\n'
  }
  return output
}

function csvConstructor(headers, columns) {
  return toCsvString(toCsvRows(headers, columns))
}

PHP Array to CSV

Instead of writing out values one by one, consider using fputcsv().

This may solve your problem immediately.

Note from comment: this creates a file on your server, so you'll need to read that file's contents back before outputting it; and if you don't want to keep a copy, unlink() the file when you are done.
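The concern in that note (temporary file, read it back, unlink it) is exactly what an in-memory buffer avoids. For comparison, here is the same write-rows-then-output flow sketched in Python with csv.writer and io.StringIO (the sample rows are hypothetical):

```python
import csv
import io

rows = [['id', 'name'], [1, 'ann'], [2, 'bob']]  # hypothetical data

# Write into an in-memory buffer instead of a file on the server,
# so there is nothing to read back or unlink afterwards.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerows(rows)

print(buf.getvalue())  # note: csv.writer terminates lines with \r\n by default
```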

JavaScript array to CSV

The cited answer was wrong. You had to change

csvContent += index < infoArray.length ? dataString+ "\n" : dataString;

to

csvContent += dataString + "\n";

As to why the cited answer was wrong (funny that it was accepted!): index, the second parameter of the forEach callback, is the index into the array being looped over; it makes no sense to compare it to the length of infoArray, which is an item of that array (and happens to be an array itself).

EDIT

Six years have passed now since I wrote this answer. Many things have changed, including browsers. The following was part of the answer:

START of aged part

BTW, the cited code is suboptimal. You should avoid repeatedly appending to a string; append to an array instead, and do an array.join("\n") at the end. Like this:

var lineArray = [];
data.forEach(function (infoArray, index) {
  var line = infoArray.join(",");
  lineArray.push(index == 0 ? "data:text/csv;charset=utf-8," + line : line);
});
var csvContent = lineArray.join("\n");

END of aged part

(Keep in mind that the CSV case is a bit different from generic string concatenation, since for every string you also have to add the separator.)

Anyway, the above seems not to be true anymore, at least not for Chrome and Firefox (it seems to still be true for Safari, though).

To put an end to the uncertainty, I wrote a jsPerf test that checks which is faster when building a comma-separated string: pushing the pieces onto an array and joining it at the end, or concatenating each piece with the comma and appending directly to the result string with the += operator.

Please follow the link and run the test, so that we have enough data to be able to talk about facts instead of opinions.


