Comparative Biochemistry and Physiology

Elsevier
ISSN: 0010-406X

Publications: 2,958
Citations: 64,169
h-index: 89

Top-3 organizations:
Monash University (10 publications)
Duke University (8 publications)

Top-3 countries:
USA (323 publications)
United Kingdom (70 publications)
Australia (46 publications)

Most cited in 5 years

Publications found: 1648
Optimal approximation of infinite-dimensional holomorphic functions II: recovery from i.i.d. pointwise samples
Adcock B., Dexter N., Moraga S. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Integrability of weak mixed first-order derivatives and convergence rates of scrambled digital nets
Liu Y. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

On the Complexity of Orbit Word Problems
Maller M. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Computing Approximate Roots of Monotone Functions
Hollender A., Lawrence C., Segal-Halevi E. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Online outcome weighted learning with general loss functions
Yang A., Fan J., Xiang D. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Factoring sparse polynomials fast
Demin A., van der Hoeven J. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Weighted mesh algorithms for general Markov decision processes: Convergence and tractability
Belomestny D., Schoenmakers J., Zorina V. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Direct estimates for adaptive time-stepping finite element methods
Actis M., Gaspoz F., Morin P., Schneider C., Schneider N. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Succinct obituary in memoriam of Joos Heintz
Pardo L.M. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

A revisit on Nesterov acceleration for linear ill-posed problems
Liu D., Huang Q., Jin Q. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 1

A Procedure for Increasing the Convergence Order of Iterative Methods from p to 5p for Solving Nonlinear System
George S., M M., Gopal M., G C., Argyros I.K. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Fast interpolation of multivariate polynomials with sparse exponents
van der Hoeven J., Lecerf G. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 1

No existence of a linear algorithm for the one-dimensional Fourier phase retrieval
Huang M., Xu Z. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0
Abstract: Fourier phase retrieval, which aims to reconstruct a signal from its Fourier magnitude, is of fundamental importance in engineering and science. In this paper, we provide a theoretical understanding of algorithms for the one-dimensional Fourier phase retrieval problem. Specifically, we demonstrate that if an algorithm exists which can reconstruct an arbitrary signal x ∈ C^N to ε-precision in Poly(N)·log(1/ε) time from the magnitude of its discrete Fourier transform and its initial value x(0), then P = NP. This partially elucidates why, despite the fact that almost all signals are uniquely determined by their Fourier magnitude and the absolute value of their initial value |x(0)|, no algorithm with theoretical guarantees has been proposed in the last few decades. Our proofs employ the result from computational complexity theory that the Product Partition problem is NP-complete in the strong sense.
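The abstract notes that almost all signals are determined by their Fourier magnitude together with |x(0)|; the magnitude alone, however, is ambiguous. A minimal NumPy sketch (illustrative only, not from the paper) of one classical ambiguity: conjugate time-reversal changes the signal but preserves the DFT magnitude.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
x = rng.normal(size=N) + 1j * rng.normal(size=N)

# Conjugate time-reversal: y[n] = conj(x[(-n) mod N]) has DFT Y[k] = conj(X[k]),
# so |DFT(y)| equals |DFT(x)| even though y differs from x.
y = np.conj(x[(-np.arange(N)) % N])

assert not np.allclose(x, y)
assert np.allclose(np.abs(np.fft.fft(x)), np.abs(np.fft.fft(y)))
```

This is why recovery guarantees need extra information such as x(0): the magnitude map is not injective even up to trivial symmetries of this kind.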
High probability bounds on AdaGrad for constrained weakly convex optimization
Hong Y., Lin J. Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0
Abstract: In this paper, we study the high-probability convergence of AdaGrad-Norm for constrained, non-smooth, weakly convex optimization in the bounded-noise and sub-Gaussian-noise cases. We also investigate a more general accelerated gradient descent (AGD) template (Ghadimi and Lan, 2016) encompassing AdaGrad-Norm, Nesterov's accelerated gradient descent, and RSAG (Ghadimi and Lan, 2016) under different parameter choices. We provide a high-probability convergence rate Õ(1/T) without knowledge of the weak convexity parameter or the gradient bound for tuning the step sizes.
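AdaGrad-Norm, referenced in the abstract, adapts a single scalar step size from the accumulated squared gradient norms, so no gradient bound is needed to tune it. A minimal projected-gradient sketch under my own assumptions (the toy problem and function names are illustrative, not the paper's setup):

```python
import numpy as np

def adagrad_norm(grad, project, x0, eta=1.0, steps=200):
    """AdaGrad-Norm with projection: b_t^2 accumulates ||g_s||^2,
    and the update is x_{t+1} = project(x_t - eta / b_t * g_t)."""
    x, b2 = np.asarray(x0, dtype=float), 1e-8  # small b2 avoids division by zero
    for _ in range(steps):
        g = grad(x)
        b2 += float(g @ g)                 # accumulate squared gradient norm
        x = project(x - eta / np.sqrt(b2) * g)
    return x

# Toy constrained problem: minimize ||x - c||^2 over the unit ball.
c = np.array([2.0, 0.0])
grad = lambda x: 2.0 * (x - c)
project = lambda x: x / max(1.0, np.linalg.norm(x))  # Euclidean projection

x_star = adagrad_norm(grad, project, np.zeros(2))
# x_star approaches [1, 0], the projection of c onto the unit ball
```

Note the step size eta / sqrt(b2) shrinks automatically as gradients accumulate; this is the parameter-free behavior the abstract's high-probability analysis concerns.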
Best Paper Award of the Journal of Complexity
Journal of Complexity, 2025 (Q1, Elsevier); citations by CoLab: 0

Top-100

[Chart: Citing journals; axis ticks only, data not recoverable]
[Chart: Citing publishers; axis ticks only, data not recoverable]
[Chart: Publishing organizations; axis ticks only, data not recoverable]
Publishing countries (country, publications, share of total):
USA, 323, 10.92%
United Kingdom, 70, 2.37%
Australia, 46, 1.56%
Canada, 29, 0.98%
France, 14, 0.47%
Netherlands, 11, 0.37%
Israel, 9, 0.3%
Brazil, 8, 0.27%
India, 7, 0.24%
Italy, 7, 0.24%
Belgium, 6, 0.2%
Sweden, 6, 0.2%
Norway, 5, 0.17%
Poland, 5, 0.17%
USSR, 5, 0.17%
Germany, 4, 0.14%
Spain, 4, 0.14%
Japan, 4, 0.14%
Czechoslovakia, 4, 0.14%
Russia, 3, 0.1%
Mexico, 3, 0.1%
Puerto Rico, 3, 0.1%
Czech Republic, 3, 0.1%
Uganda, 2, 0.07%
Finland, 2, 0.07%
Chile, 2, 0.07%
Ukraine, 1, 0.03%
China, 1, 0.03%
Hungary, 1, 0.03%
Venezuela, 1, 0.03%
Lebanon, 1, 0.03%
Nigeria, 1, 0.03%
New Zealand, 1, 0.03%
Pakistan, 1, 0.03%
Slovakia, 1, 0.03%
South Africa, 1, 0.03%
(6 more countries not shown)