Burst switching appears to be the most promising technique for meeting the transparency and bandwidth-on-demand requirements of wavelength-division-multiplexed, ultra-high-capacity backbone optical networks. Modeling and simulation of such networks present challenges that are difficult, and in extreme cases impossible, to meet with analytical or discrete-event approaches: the former lack sufficiently accurate models, while the latter requires excessively large computational resources. In this paper a different modeling approach is presented which, under the assumption of stationary distributions, allows Monte Carlo sampling to be used to calculate the overall burst loss probability of the JIT protocol in optical burst-switched networks. Simulation results are presented for the NSFNet and the Pan-European Network, demonstrating the applicability of the proposed approach.
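To illustrate the general idea of Monte Carlo estimation of burst loss probability (not the paper's actual network model), the following is a minimal toy sketch: it assumes a single link with a fixed number of wavelengths and a stationary Poisson-distributed occupancy, and counts the fraction of sampled arrivals that find all wavelengths busy. All names and the single-link occupancy model are hypothetical simplifications.

```python
import math
import random


def poisson_sample(rng, lam):
    """Draw one Poisson(lam) variate using Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1


def estimate_loss_probability(num_wavelengths, offered_load, num_samples, seed=0):
    """Monte Carlo estimate of burst blocking on a single hypothetical link.

    Toy model: at each sampled arrival instant, the number of busy
    wavelengths is drawn from a stationary Poisson distribution with
    mean `offered_load`; the burst is lost when every one of the
    `num_wavelengths` wavelengths is busy.
    """
    rng = random.Random(seed)
    lost = 0
    for _ in range(num_samples):
        busy = poisson_sample(rng, offered_load)
        if busy >= num_wavelengths:
            lost += 1
    return lost / num_samples
```

The estimate converges to the stationary blocking probability as the sample count grows; in a full network-level study, a sample would instead be a draw of the whole network state, with a burst counted as lost if it is blocked on any link of its path.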