I am creating a basic tile-based 2D game, mostly from scratch, in Java; however, all I need is pseudo-code for how this could be achieved. My problem is that my world (stored in a HashMap so I don't have to store a null object for every position down to 0) may be millions or even billions of tiles large, so I'm sure it is not efficient to loop through every tile.

A solution I have thought of is to calculate, from the position of the camera in the world and the size of the tiles, how many tiles could fit on the screen along the X and Y axes, and also their pixel offset on the screen. I have tried (and failed) to implement this, because I am unsure how to translate the idea into maths/logic. How might I go about this?

Below I have linked an image showing which tiles I intend to show on the screen, where the black square is the centre and tiles that do not cover the screen are not shown. Thanks for your help in advance!
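A minimal sketch of the calculation being asked about, under a few assumptions not stated in the question: the camera position `(camX, camY)` is the world-space point at the centre of the screen, tiles are square and `TILE_SIZE` pixels wide, and tile `(tx, ty)` covers world coordinates `[tx*TILE_SIZE, (tx+1)*TILE_SIZE)` on each axis. All names here are hypothetical, not from the original post:

```java
public class VisibleTiles {
    static final int TILE_SIZE = 32;   // assumption: 32px tiles
    static final int SCREEN_W = 800;   // assumption: viewport size in pixels
    static final int SCREEN_H = 600;

    public static void main(String[] args) {
        double camX = 1000.5 * TILE_SIZE;  // example camera position
        double camY = -3.25 * TILE_SIZE;   // (negative coords work too)

        // World-space edges of the viewport, centred on the camera.
        double left = camX - SCREEN_W / 2.0;
        double top  = camY - SCREEN_H / 2.0;

        // First and last tile indices that overlap the viewport.
        // Math.floor handles negative coordinates correctly,
        // unlike integer division, which truncates toward zero.
        int firstTx = (int) Math.floor(left / TILE_SIZE);
        int lastTx  = (int) Math.floor((left + SCREEN_W) / TILE_SIZE);
        int firstTy = (int) Math.floor(top / TILE_SIZE);
        int lastTy  = (int) Math.floor((top + SCREEN_H) / TILE_SIZE);

        for (int ty = firstTy; ty <= lastTy; ty++) {
            for (int tx = firstTx; tx <= lastTx; tx++) {
                // Screen-space position of this tile's top-left corner:
                // world position minus the viewport's left/top edge.
                int screenX = (int) Math.round(tx * TILE_SIZE - left);
                int screenY = (int) Math.round(ty * TILE_SIZE - top);
                // Look tile (tx, ty) up in the HashMap and draw it at
                // (screenX, screenY); a missing key is an empty tile
                // and can simply be skipped.
            }
        }
    }
}
```

The key point is that the loop bounds depend only on the camera and screen size, never on the world size, so drawing stays cheap no matter how many tiles the HashMap holds.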
"How can I calculate all the tiles visible to a camera in 2D?" — posted on gamedev.stackexchange.com, August 15, 2019.