Informatsionnye Tekhnologii i Vychislitel'nye Sistemy, 2013, Issue 3, Pages 46–54 (Mi itvs125)

COMPUTER GRAPHICS

Extra-large textures for high realistic terrain visualization

P. Yu. Timokhin, M. V. Mikhaylyuk

Scientific Research Institute for System Analysis of the Russian Academy of Sciences, Moscow
Abstract: The article presents a new real-time technique for rendering extra-large textures in highly realistic terrain visualization. The proposed methods and algorithms are GPU-friendly and rely on geometry shaders and the transform feedback mode.
Keywords: highly realistic visualization, texture, virtual terrain, real time, GPU.
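
The abstract only names the GPU mechanisms the technique builds on (geometry shaders and transform feedback); the paper's own algorithms are not reproduced on this page. As a rough, generic illustration of what "transform feedback mode" refers to, the sketch below shows a standard OpenGL setup for capturing geometry-shader output into a buffer. It assumes a current GL context, a loader such as GLEW, and a shader program whose geometry shader writes a varying named "tfPosition"; all identifiers (captureTileGeometry, feedbackBuf, etc.) are hypothetical and are not taken from the paper.

/* Generic OpenGL transform feedback sketch -- illustrative only,
 * not the authors' method. */
#include <GL/glew.h>

GLuint setupFeedbackProgram(GLuint program)
{
    /* Declare which varying the geometry shader output is captured into;
     * this must be set before the program is linked. */
    const char *varyings[] = { "tfPosition" };
    glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
    glLinkProgram(program);
    return program;
}

GLuint captureTileGeometry(GLuint program, GLuint srcVao,
                           GLsizei srcVertexCount, GLsizeiptr bufBytes)
{
    GLuint feedbackBuf;
    glGenBuffers(1, &feedbackBuf);
    glBindBuffer(GL_TRANSFORM_FEEDBACK_BUFFER, feedbackBuf);
    glBufferData(GL_TRANSFORM_FEEDBACK_BUFFER, bufBytes, NULL, GL_DYNAMIC_COPY);
    glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, feedbackBuf);

    glUseProgram(program);
    glBindVertexArray(srcVao);

    /* Skip rasterization: only the processed geometry written to the
     * feedback buffer is of interest in this pass. */
    glEnable(GL_RASTERIZER_DISCARD);
    glBeginTransformFeedback(GL_TRIANGLES);
    glDrawArrays(GL_TRIANGLES, 0, srcVertexCount);
    glEndTransformFeedback();
    glDisable(GL_RASTERIZER_DISCARD);

    return feedbackBuf;  /* captured geometry, reusable in later render passes */
}
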
Document Type: Article
Language: Russian
Citation: P. Yu. Timokhin, M. V. Mikhaylyuk, “Extra-large textures for high realistic terrain visualization”, Informatsionnye Tekhnologii i Vychislitel'nye Sistemy, 2013, no. 3, 46–54
Citation in AMSBIB format:
\Bibitem{TimMik13}
\by P.~Yu.~Timokhin, M.~V.~Mikhaylyuk
\paper Extra-large textures for high realistic terrain visualization
\jour Informatsionnye Tekhnologii i Vychislitel'nye Sistemy
\yr 2013
\issue 3
\pages 46--54
\mathnet{http://mi.mathnet.ru/itvs125}
Linking options:
  • https://www.mathnet.ru/eng/itvs125
  • https://www.mathnet.ru/eng/itvs/y2013/i3/p46