Problems with memory access (?)
RedNifre
Posts: 84
Hi,
I'm currently writing a game, but I have problems loading a level correctly. I never worked with direct memory access and pointer arithmetic before, so I think that I'm doing something wrong when accessing the memory.
This is how I intend level loading to work:
- Levels are 33 bytes and consist of 88 tiles (3 bits per tile). The tiles are packed little-endian within the bytes
- 3 bits at a time are masked out and used as an index into a word table to get the tile word for the HEL engine
The data section of my program looks like this:
' triplets to word table
triplet word _
        word W
        word R
        word L
        word S
        word M
        word N
        word _

' levels are 11x8 without border
' 1 line == 1 level == 33 bytes == 88 triplets (little endian)
levels  byte 73, 146, 36, ...
        byte 73...
        ...
To load a tile in the tile array of the HEL engine I do this:
PUB loadLevel(newlev) | levelPointer, i, gi
  levelPointer := @levels + (newlev - 1)*33
  i := 0
  gi := 0
  repeat while i < 33
    ' set 3 bytes / 8 tiles at a time
    ' ______________________
    ' -2--2 1--1--1 0--0--0
    ' -5 4--4--4 3--3--3 2--
    ' 7--7--7 6--6--6 5--5--
    ' 0
    gameField[gi+17+5*(gi/11)] := triplet.WORD[ levelPointer.BYTE[ i ] & 7 ]
    ...
Unfortunately, this loads some rubbish tiles and then fills most of the game field with zero tiles.
The function is called with "1" as parameter.
What I mean is this:
1. set levelPointer to the beginning of the level that should be loaded
2. get the lowest 3 bits of the first byte of the current level
3. use this number to get the right word (a tile plus palette for HEL)
4. write that tile to the screen
I know that the level data, the word table, and the calculation of where to write the tile are correct. So I guess that either the masking or the memory access is wrong. Do you see any problem in how I try to read the level data?
Thanks in advance
Post Edited (RedNifre) : 5/19/2007 3:25:41 AM GMT
Comments
Each line in the DAT section is written as 33 bytes, but when compiled it gets padded to 36 bytes to match the size of longs. That's why my calculations were off.
Andre'
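If the compiler does pad each level line to a long boundary like that, the pointer arithmetic just needs the padded stride instead of the logical size. A minimal sketch, assuming a stride of 36 (33 rounded up to a multiple of 4) and a hypothetical `levelStart` helper:

```c
#include <stdint.h>
#include <stddef.h>

#define LEVEL_BYTES  33  /* logical size: 88 tiles * 3 bits   */
#define LEVEL_STRIDE 36  /* padded to the next multiple of 4  */

/* Start address of level n (1-based); made-up helper name. */
const uint8_t *levelStart(const uint8_t *levels, int n)
{
    return levels + (size_t)(n - 1) * LEVEL_STRIDE;
}
```

Only the stride changes; each level is still read as 33 bytes of tile data, and the trailing 3 pad bytes are skipped.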