Thickness Effects Are Minor in the Energy-Release Rate Integral for Bent Plates Containing Elliptic Holes or Cracks

[+] Author and Article Information
J. G. Simmonds, J. Duva

Department of Applied Mathematics and Computer Science, University of Virginia, Charlottesville, Va. 22901

J. Appl. Mech 48(2), 320-326 (Jun 01, 1981) (7 pages) doi:10.1115/1.3157616 History: Received August 01, 1980; Online July 21, 2009


The exact value of Sanders’ path-independent, energy-release rate integral I for an infinite, bent elastic slab containing an elliptic hole is shown to be approximated by its value from classical plate theory to within a relative error of O(h/c)F(e), where h is the thickness, c is the semimajor axis of the ellipse, and F is a function of the eccentricity e. This result is based on Gol’denveiser’s analysis of three-dimensional edge effects in plates, as developed by van der Heijden. As the elliptic hole approaches a crack, F(e) ~ ln(1 − e). However, this limit is physically meaningless, because Gol’denveiser’s analysis assumes that h is small compared to the minimum radius of curvature of the ellipse. Using Knowles and Wang’s analysis of the stresses in a cracked plate predicted by Reissner’s theory, we show that the relative error in computing I from classical plate theory is only O((h/c) ln(h/c)), where c is the semicrack length. Our results suggest that classical plate and shell theories are entirely adequate for predicting crack growth, within the limitations of applying any elastic theory to an inherently inelastic phenomenon.
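The contrast between the two error estimates can be made concrete numerically. The sketch below (illustrative only: the O(·) constants are unknown and set to 1, so only the scaling behavior is taken from the abstract) shows that the elliptic-hole estimate (h/c)·F(e) with F(e) ~ ln(1 − e) diverges as the hole degenerates toward a crack (e → 1), whereas the direct crack estimate (h/c) ln(h/c) remains small for thin plates.

```python
import math

def error_elliptic_hole(h_over_c, e):
    """Classical-plate relative error estimate for an elliptic hole:
    O(h/c) * F(e), with F(e) ~ ln(1 - e) near e = 1.
    Constants are omitted; only the scaling comes from the paper."""
    return h_over_c * abs(math.log(1.0 - e))

def error_crack(h_over_c):
    """Relative error estimate in the crack limit: O((h/c) ln(h/c))."""
    return abs(h_over_c * math.log(h_over_c))

h_over_c = 0.01  # a thin plate: thickness is 1% of the semicrack length

# The hole-based estimate grows without bound as e -> 1 ...
for e in (0.9, 0.99, 0.999999):
    print(f"e = {e}: hole estimate ~ {error_elliptic_hole(h_over_c, e):.4f}")

# ... while the crack estimate stays small, ~ 0.046 for h/c = 0.01.
print(f"crack estimate ~ {error_crack(h_over_c):.4f}")
```

This mirrors the paper's point: the ln(1 − e) divergence is an artifact of pushing the edge-effect analysis outside its range of validity, and the crack itself must be treated directly.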

Copyright © 1981 by ASME