Convert Simple Array into Two-Dimensional Array (Matrix)

Something like this?

function listToMatrix(list, elementsPerSubArray) {
    var matrix = [], i, k;

    for (i = 0, k = -1; i < list.length; i++) {
        if (i % elementsPerSubArray === 0) {
            k++;
            matrix[k] = [];
        }

        matrix[k].push(list[i]);
    }

    return matrix;
}

Usage:

var matrix = listToMatrix([1, 2, 3, 4, 5, 6, 7, 8, 9], 3);
// result: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
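A more declarative alternative uses `Array.from` with `slice`; this sketch is equivalent to the function above (the name `listToMatrixAlt` is illustrative):

```javascript
// Sketch: chunk a flat array using Array.from; the function name is illustrative.
function listToMatrixAlt(list, elementsPerSubArray) {
    var rows = Math.ceil(list.length / elementsPerSubArray);
    return Array.from({ length: rows }, function (_, row) {
        // slice copies a range without mutating the input list
        return list.slice(row * elementsPerSubArray, (row + 1) * elementsPerSubArray);
    });
}

console.log(listToMatrixAlt([1, 2, 3, 4, 5, 6, 7, 8, 9], 3));
// [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

When the list length is not a multiple of the chunk size, the last row simply comes out shorter.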

Convert a 1D array to 2D array

You can use this code:

const arr = [1,2,3,4,5,6,7,8,9];

const newArr = [];
while (arr.length) newArr.push(arr.splice(0, 3)); // note: splice empties arr as a side effect

console.log(newArr);
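Because `splice` consumes the source array, a non-destructive variant may be preferable when the original data is still needed. A minimal sketch using `slice` instead:

```javascript
// Sketch: the same chunking without destroying the source array.
// splice removes elements in place; slice only copies them.
const src = [1, 2, 3, 4, 5, 6, 7, 8, 9];
const chunked = [];
for (let i = 0; i < src.length; i += 3) {
    chunked.push(src.slice(i, i + 3));
}

console.log(chunked);    // [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
console.log(src.length); // 9 — src is untouched
```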

Converting One Dimensional Array to Two Dimensional

The problem is that you never tell JavaScript that the objects stored in the outermost array should themselves be arrays. When you tried to use the inner arrays, they had not yet been initialized, so you got undefined. To fix this, change your code to:

var k = 0;
var cubes = [];
var i, j;
for (i = 0; i < n; i++) {
    cubes[i] = []; // initialize the inner array before using it
    for (j = 0; j < n; j++) {
        cubes[i][j] = matriks[k];
        document.write("[" + i + "][" + j + "] = " + cubes[i][j] + " ");
        k++;
    }
}

How to convert three arrays to a matrix (two-dimensional array) in Java

You could put the arrays a1, a2, a3 into a list and iterate over it to fill your 2D array x as follows:

double[] a1 = {2.1, 2.2, 2.3, 2.4, 2.5};
double[] a2 = {3.1, 3.2, 3.3, 3.4, 3.5};
double[] a3 = {4.1, 4.2, 4.3, 4.4, 4.5};

final List<double[]> aList = Arrays.asList(a1, a2, a3);

double[][] x = new double[a1.length][3];
for (int i = 0; i < a1.length; i++) {
    for (int j = 0; j < aList.size(); j++) {
        x[i][j] = aList.get(j)[i];
    }
}

Remarks:

  • initialize x outside of the loop
  • start variable names with a lowercase letter
  • Java-style array declaration is of the form type[] name

Convert 1D array into 2D array JavaScript

var array1 = [15, 33, 21, 39, 24, 27, 19, 7, 18, 28, 30, 38];
var i, j, t;
var array2 = new Array(4);

for (t = 0; t < 4; t++) {
    array2[t] = new Array(3);
}

for (i = 0; i < 4; i++) {
    for (j = 0; j < 3; j++) {
        array2[i][j] = array1[i * 3 + j]; // here was the error
    }
}

console.log(array2);

I just solved it, thanks for your comments. I actually used one of the suggestions, but with 3 instead of 2.
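The index arithmetic above generalizes: element (i, j) of a rows × cols grid sits at position i * cols + j in the flat array. A parameterized sketch (the function name is illustrative):

```javascript
// Sketch: generalize the i*3+j mapping to arbitrary dimensions.
function to2d(flat, rows, cols) {
    var out = new Array(rows);
    for (var i = 0; i < rows; i++) {
        out[i] = new Array(cols);
        for (var j = 0; j < cols; j++) {
            out[i][j] = flat[i * cols + j]; // row-major index mapping
        }
    }
    return out;
}

console.log(to2d([15, 33, 21, 39, 24, 27, 19, 7, 18, 28, 30, 38], 4, 3));
// [[15, 33, 21], [39, 24, 27], [19, 7, 18], [28, 30, 38]]
```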

Convert a 1D array to a 2D array in numpy

You want to reshape the array.

B = np.reshape(A, (-1, 2))

where -1 tells NumPy to infer the size of that dimension from the size of the input array.

How to efficiently convert a one-dimensional array to a two-dimensional array in Swift 3

You can write something like this:

var pixels: [UInt8] = [0,1,2,3, 4,5,6,7, 8,9,10,11, 12,13,14,15]
let bytesPerRow = 4
assert(pixels.count % bytesPerRow == 0)
let pixels2d: [[UInt8]] = stride(from: 0, to: pixels.count, by: bytesPerRow).map {
    Array(pixels[$0..<$0+bytesPerRow])
}

But with the value semantics of Swift Arrays, every attempt to create a new nested Array requires copying the contents, so this may not be "efficient" enough for your purpose.

Reconsider whether you really need such a nested Array.

Converting a 1D array to a 2D array

You're assigning the wrong value to arr[i][j], and the inner loop should be bounded by arr[i].length, not arr[j].length:

for (int i = 0; i < arr.length; i++) {
    for (int j = 0; j < arr[i].length; j++) {
        arr[i][j] = array[i * arr[i].length + j];
    }
}

How can I create a two dimensional array in JavaScript?

Practically? Yes. You can create an array of arrays, which functions as a 2D array since every item is itself an array:

let items = [
    [1, 2],
    [3, 4],
    [5, 6]
];
console.log(items[0][0]); // 1
console.log(items[0][1]); // 2
console.log(items[1][0]); // 3
console.log(items[1][1]); // 4
console.log(items);
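To pre-build an empty grid of a given size rather than writing a literal, one common sketch uses `Array.from`. Note that the tempting `new Array(rows).fill([])` would make every row the same shared inner array:

```javascript
// Sketch: create a rows x cols grid initialized to 0.
// Array.from runs the callback once per row, so each inner array is distinct.
const rows = 3, cols = 2;
const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));

grid[0][0] = 7; // only affects the first row
console.log(grid); // [[7, 0], [0, 0], [0, 0]]
```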

