%0 Journal Article
%T Generalized Bayesian Inference for Regression-Type Models with an Intractable Normalizing Constant
%A Qinqin Gan
%A Wanzhou Ye
%J Advances in Pure Mathematics
%P 319-338
%@ 2160-0384
%D 2025
%I Scientific Research Publishing
%R 10.4236/apm.2025.155016
%X Regression models with intractable normalizing constants are valuable tools for analyzing complex data structures, yet parameter inference for such models remains highly challenging, particularly when observations are discrete. In statistical inference, discrete state spaces introduce significant computational difficulties, as the normalizing constant often requires summation over extremely large or even infinite sets, which is typically infeasible in practice. These challenges are further compounded when observations are independent but not identically distributed. This paper addresses these issues by developing a novel generalized Bayesian inference approach tailored to regression models with intractable likelihoods. The key idea is to employ a specific form of generalized Fisher divergence to update beliefs about the model parameters, thereby circumventing the need to compute the normalizing constant. The resulting generalized posterior distribution can be sampled with standard computational tools, such as Markov chain Monte Carlo (MCMC), without ever evaluating the intractable normalizing constant.
%K Intractable Normalizing Constant
%K Fisher Divergence
%K Conway-Maxwell-Poisson Regression
%U http://www.scirp.org/journal/PaperInformation.aspx?PaperID=142767
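
The abstract only sketches the method at a high level. As a rough illustration of the general recipe it describes (replace the intractable likelihood with a normalizing-constant-free divergence loss, exponentiate that loss into a generalized posterior, and sample it with MCMC), the Python sketch below applies a discrete Fisher-type divergence to a Conway-Maxwell-Poisson regression. The particular divergence estimator, the weight w, the priors, and all function names are illustrative assumptions, not the construction used in the paper.

```python
# Minimal sketch of generalized Bayesian inference for a Conway-Maxwell-Poisson
# (CMP) regression via a discrete Fisher-type divergence. NOT the authors'
# implementation: the divergence estimator, weight w, priors, and sampler
# settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def dfd_loss(beta, log_nu, X, y):
    """Per-sample loss whose expectation equals a (forward) discrete Fisher
    divergence up to a theta-free constant:
        (p(y+1)/p(y))^2 - 2 * p(y)/p(y-1) * 1{y >= 1}.
    For CMP, p(y+1)/p(y) = lambda / (y+1)^nu, so the normalizing
    constant never appears."""
    nu = np.exp(log_nu)
    lam = np.exp(X @ beta)                         # lambda_i = exp(x_i' beta)
    r_up = lam / (y + 1.0) ** nu                   # p(y+1)/p(y)
    r_down = np.where(y >= 1, lam / np.maximum(y, 1.0) ** nu, 0.0)  # p(y)/p(y-1)
    return np.sum(r_up ** 2 - 2.0 * r_down)

def log_gen_post(beta, log_nu, X, y, w=1.0):
    """Generalized log-posterior: log prior - w * total loss.
    Prior: independent N(0, 10^2) on beta and on log(nu) (an assumption)."""
    log_prior = -0.5 * (np.sum(beta ** 2) + log_nu ** 2) / 10.0 ** 2
    return log_prior - w * dfd_loss(beta, log_nu, X, y)

def rw_metropolis(X, y, n_iter=5000, step=0.05, w=1.0):
    """Random-walk Metropolis over theta = (beta, log nu)."""
    p = X.shape[1]
    theta = np.zeros(p + 1)                        # [beta, log_nu]
    cur = log_gen_post(theta[:p], theta[p], X, y, w)
    draws = np.empty((n_iter, p + 1))
    for t in range(n_iter):
        prop = theta + step * rng.standard_normal(p + 1)
        new = log_gen_post(prop[:p], prop[p], X, y, w)
        if np.log(rng.uniform()) < new - cur:
            theta, cur = prop, new
        draws[t] = theta
    return draws

# Toy usage: Poisson counts (the CMP special case nu = 1) with two covariates.
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.8])))
draws = rw_metropolis(X, y, n_iter=2000)
print(draws[-500:].mean(axis=0))  # rough posterior means of (beta0, beta1, log nu)
```

The key design point the abstract emphasizes carries over to this sketch: every quantity in the loss is a ratio of probability masses at neighboring counts, so the summation defining the CMP normalizing constant is never evaluated, and the generalized posterior can be sampled with entirely standard MCMC machinery.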