How to Declare a Two Dimensional Array

How to define a two-dimensional array?

You're technically trying to index an uninitialized list. You have to initialize the outer list with inner lists before adding items; the Python idiom for this is a "list comprehension".

# Creates a list containing 5 lists, each of 8 items, all set to 0
w, h = 8, 5
Matrix = [[0 for x in range(w)] for y in range(h)]

# You can now add items to the list:

Matrix[0][0] = 1
Matrix[6][0] = 3 # IndexError! the outer list has only 5 rows (indices 0-4)
Matrix[0][6] = 3 # valid: each inner list has 8 items

Note that the matrix is "y" address major, in other words, the "y index" comes before the "x index".

print(Matrix[0][0]) # prints 1
x, y = 0, 6
print(Matrix[x][y]) # prints 3; be careful with indexing!

Although you can name the loop variables as you wish, I write it this way to avoid the confusion that can arise with the indexing if you use "x" for both the inner and outer lists and want a non-square matrix.
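A common shortcut, `[[0] * w] * h`, looks equivalent but is a trap: the outer list holds h references to one and the same inner list, so mutating one row mutates them all. A minimal sketch of the difference:

```python
w, h = 8, 5

# Broken: the outer list holds h references to ONE inner list
bad = [[0] * w] * h
bad[0][0] = 1
assert bad[1][0] == 1  # every "row" changed, they are the same object

# Correct: the comprehension builds a fresh inner list per row
good = [[0 for x in range(w)] for y in range(h)]
good[0][0] = 1
assert good[1][0] == 0  # other rows are unaffected
```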

How can I create a two dimensional array in JavaScript?

let items = [
    [1, 2],
    [3, 4],
    [5, 6]
];
console.log(items[0][0]); // 1
console.log(items[0][1]); // 2
console.log(items[1][0]); // 3
console.log(items[1][1]); // 4
console.log(items);

What is the best way to make a two-dimensional array in C?

This

int **arr = (int **)malloc(sizeof(int *) * 3);

is not a declaration or allocation of a two-dimensional array

Here a one-dimensional array with the element type int * is created. Each element of that array in turn points to a dynamically allocated one-dimensional array with the element type int.

This declaration of a two-dimensional array

const int row = 3;
const int col = 4;

int arr[row][col] = {
    { 1, 2, 3, 4 },
    { 3, 4, 5, 6 },
    { 5, 6, 7, 8 }
};

is incorrect. In C, a const-qualified variable is not an integer constant expression, so you declared a variable-length array, and variable-length arrays may not be initialized in their declaration.

You could write instead

enum { row = 3, col = 4 };

int arr[row][col] = {
    { 1, 2, 3, 4 },
    { 3, 4, 5, 6 },
    { 5, 6, 7, 8 }
};

When such an array is passed to a function, it is implicitly converted to a pointer to its first element, which has the type int (*)[col].

You could pass it to a function that has a variable-length-array parameter the following way

void my_func( size_t row, size_t col, int arr[row][col] )
{
    printf( "test2: %d", arr[0][1] );
}

Or, if you place the definition of the enumeration before the function declaration

enum { row = 3, col = 4 };

then the function could also be declared like

void my_func( int arr[][col], size_t row )
{
    printf( "test2: %d", arr[0][1] );
}

Here is a demonstration program that shows three different approaches: the first defines an array whose sizes are compile-time constants; the second creates a variable-length array; the third dynamically allocates a one-dimensional array of pointers to dynamically allocated one-dimensional arrays.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

enum { row = 3, col = 4 };

void output1( int a[][col], size_t row )
{
    for ( size_t i = 0; i < row; i++ )
    {
        for ( size_t j = 0; j < col; j++ )
        {
            printf( "%d ", a[i][j] );
        }
        putchar( '\n' );
    }
}

void output2( size_t row, size_t col, int a[row][col] )
{
    for ( size_t i = 0; i < row; i++ )
    {
        for ( size_t j = 0; j < col; j++ )
        {
            printf( "%d ", a[i][j] );
        }
        putchar( '\n' );
    }
}

void output3( int **a, size_t row, size_t col )
{
    for ( size_t i = 0; i < row; i++ )
    {
        for ( size_t j = 0; j < col; j++ )
        {
            printf( "%d ", a[i][j] );
        }
        putchar( '\n' );
    }
}

int main(void)
{
    /* First approach: the sizes are the compile-time enum constants */
    int arr1[row][col] =
    {
        { 1, 2, 3, 4 },
        { 3, 4, 5, 6 },
        { 5, 6, 7, 8 }
    };

    output1( arr1, row );
    putchar( '\n' );

    /* Second approach: these local constants shadow the enum constants,
       so arr2 below is a variable-length array */
    const size_t row = 3, col = 4;

    int arr2[row][col];

    memcpy( arr2, arr1, row * col * sizeof( int ) );

    output2( row, col, arr2 );
    putchar( '\n' );

    /* Third approach: a dynamically allocated array of pointers,
       each pointing to a dynamically allocated row */
    int **arr3 = malloc( row * sizeof( int * ) );

    for ( size_t i = 0; i < row; i++ )
    {
        arr3[i] = malloc( col * sizeof( int ) );
        memcpy( arr3[i], arr1[i], col * sizeof( int ) );
    }

    output3( arr3, row, col );
    putchar( '\n' );

    for ( size_t i = 0; i < row; i++ )
    {
        free( arr3[i] );
    }

    free( arr3 );
}

The program output is

1 2 3 4 
3 4 5 6
5 6 7 8

1 2 3 4
3 4 5 6
5 6 7 8

1 2 3 4
3 4 5 6
5 6 7 8

Note that the function output2 can be used with the array arr1 in exactly the same way as it is used with the array arr2.

Declare an empty two-dimensional array in Javascript?

You can just declare a regular array like so:

var arry = [];

Then when you have a pair of values to add to the array, all you need to do is:

arry.push([value_1, value_2]);

And yes, the first time you call arry.push, the pair of values will be placed at index 0.

From the nodejs repl:

> var arry = [];
undefined
> arry.push([1,2]);
1
> arry
[ [ 1, 2 ] ]
> arry.push([2,3]);
2
> arry
[ [ 1, 2 ], [ 2, 3 ] ]

Of course, since JavaScript is dynamically typed, no type checker will enforce that the array remains two-dimensional. You will have to make sure you only add pairs of coordinates and not do the following:

> arry.push(100);
3
> arry
[ [ 1, 2 ],
[ 2, 3 ],
100 ]

How to use a two dimensional array in function declaration statement?

The prototype

void mat(int [][],int [][],int ,int);  

should be

void mat(int [][10],int [][10],int ,int);  

You must specify all dimensions except the first. Put another way, the above prototype is equivalent to

void mat(int (*)[10],int (*)[10],int ,int);  

int (*)[10] is a type (pointer to an array of 10 ints); without the size 10, the pointed-to array type is incomplete.

How to declare, initialize, and use 2-dimensional arrays in JavaScript using text from an HTML page

There is no native "matrix" structure in the language, but you can create one without much trouble as long as you allocate the required storage.

Let's say you want to create a 3x3 matrix. First you create an Array that will store references to each row (or column, depending on your point of view).

function createMatrix(N, M) {
    var matrix = new Array(N); // Array with initial size of N, not fixed!

    for (var i = 0; i < N; ++i) {
        matrix[i] = new Array(M);
    }

    return matrix;
}

Syntax for creating a two-dimensional array in Java

Try the following:

int[][] multi = new int[5][10];

... which is shorthand for something like this:

int[][] multi = new int[5][];
multi[0] = new int[10];
multi[1] = new int[10];
multi[2] = new int[10];
multi[3] = new int[10];
multi[4] = new int[10];

Note that every element will be initialized to the default value for int, 0, so the above are also equivalent to:

int[][] multi = new int[][]{
    { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 },
    { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 },
    { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 },
    { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 },
    { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 }
};
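Because the rows are allocated individually, they do not have to share a length. A sketch of a ragged ("jagged") array (the class name JaggedDemo is made up for illustration):

```java
public class JaggedDemo {
    public static void main(String[] args) {
        // Only the outer array is sized up front
        int[][] tri = new int[3][];

        // Each row gets its own length
        tri[0] = new int[1];
        tri[1] = new int[2];
        tri[2] = new int[3];

        tri[2][2] = 9;
        System.out.println(tri[2].length); // 3
        System.out.println(tri[2][2]);     // 9
    }
}
```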

Two-dimensional array in Swift

Define mutable array

// 2 dimensional array of arrays of Ints 
var arr = [[Int]]()

OR:

// 2 dimensional array of arrays of Ints 
var arr: [[Int]] = []

OR if you need an array of predefined size (as mentioned by @0x7fffffff in comments):

// 2 dimensional array of arrays of Ints, all set to 0. Array size is 3x2 (pre-Swift 3 syntax)
var arr = Array(count: 3, repeatedValue: Array(count: 2, repeatedValue: 0))

// ...and for Swift 3+:
var arr = Array(repeating: Array(repeating: 0, count: 2), count: 3)

Change element at position

arr[0][1] = 18

OR

let myVar = 18
arr[0][1] = myVar

Change sub array

arr[1] = [123, 456, 789] 

OR

arr[0].append(234)

OR

arr[0] += [345, 678]

If you had a 3x2 array of 0 (zeros) before these changes, you now have:

[
    [0, 0, 234, 345, 678], // 5 elements!
    [123, 456, 789],
    [0, 0]
]

So be aware that the sub-arrays are mutable: you can reshape the initial array so it no longer represents a rectangular matrix.

Examine size/bounds before access

let a = 0
let b = 1

if arr.count > a && arr[a].count > b {
    print(arr[a][b]) // println in Swift 1.x
}

Remarks:
The same rules apply to 3- and N-dimensional arrays.


