TECHNICAL PAPERS

Inertial Effects on Void Growth in Porous Viscoplastic Materials

W. Tong, G. Ravichandran

Graduate Aeronautical Laboratories, Division of Engineering and Applied Science, California Institute of Technology, Pasadena, CA 91125

J. Appl. Mech. 62(3), 633-639 (Sep 01, 1995) (7 pages). doi:10.1115/1.2895993. History: Received August 09, 1993; Revised March 01, 1994; Online October 30, 2007

Abstract

The present work examines inertial effects on void growth in viscoplastic materials, effects that have been largely neglected in analyses of dynamic crack growth and spallation phenomena based on existing continuum porous material models. Dynamic void growth in porous materials is investigated by analyzing the finite deformation of an elastic/viscoplastic spherical shell under intense hydrostatic tensile loading. Under typical dynamic loading conditions, inertia is found to have a strong stabilizing effect on the void growth process and consequently to delay coalescence, even when the high rate sensitivity of materials at very high strain rates is taken into account. The effects of strain hardening and thermal softening are found to be relatively small. Approximate relations are suggested for incorporating inertial effects and the rate sensitivity of the matrix material into porous viscoplastic constitutive models for dynamic ductile fracture analyses under certain loading conditions.
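The stabilizing role of inertia described in the abstract can be illustrated with a much simpler model than the paper's elastic/viscoplastic shell analysis. The sketch below integrates a Rayleigh-Plesset-type equation for a spherical void of radius a in an incompressible medium with a constant strength term and a linear viscous resistance; this toy model and all parameter values are assumptions for illustration only, not the authors' formulation. Dropping the inertial terms reduces the equation to a first-order quasi-static law with a closed-form exponential solution, so the two growth histories can be compared directly.

```python
import math

# Illustrative toy model (NOT the paper's formulation): dynamic growth of a
# spherical void of radius a under remote tension p, with a constant
# strength-like resistance p_y and viscous resistance 4*eta*adot/a:
#
#   rho*(a*a_dd + 1.5*a_d**2) = p - p_y - 4*eta*a_d/a
#
# All material parameters below are hypothetical round numbers.

def grow_void_dynamic(p=2.0e9, p_y=1.0e9, eta=1.0, rho=8900.0,
                      a0=1.0e-6, t_end=2.0e-8, n=20000):
    """Integrate the inertial equation with classical RK4; return a(t_end)."""
    dt = t_end / n
    a, v = a0, 0.0  # radius [m], radial velocity [m/s]

    def accel(a, v):
        # Rearranged equation of motion: a_dd as a function of (a, a_d).
        return ((p - p_y - 4.0 * eta * v / a) / rho - 1.5 * v * v) / a

    for _ in range(n):
        k1a, k1v = v, accel(a, v)
        k2a, k2v = v + 0.5 * dt * k1v, accel(a + 0.5 * dt * k1a, v + 0.5 * dt * k1v)
        k3a, k3v = v + 0.5 * dt * k2v, accel(a + 0.5 * dt * k2a, v + 0.5 * dt * k2v)
        k4a, k4v = v + dt * k3v, accel(a + dt * k3a, v + dt * k3v)
        a += dt * (k1a + 2 * k2a + 2 * k3a + k4a) / 6.0
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
    return a

def grow_void_quasistatic(p=2.0e9, p_y=1.0e9, eta=1.0,
                          a0=1.0e-6, t_end=2.0e-8):
    """Same model with inertia neglected: 4*eta*adot/a = p - p_y,
    which integrates to exponential growth a(t) = a0*exp((p-p_y)*t/(4*eta))."""
    return a0 * math.exp((p - p_y) * t_end / (4.0 * eta))
```

With these (hypothetical) parameters, the quasi-static solution grows exponentially, while the inertial terms cap the radial velocity and leave the void far smaller at the same instant, qualitatively echoing the abstract's conclusion that inertia retards void growth under intense short-duration loading.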

Copyright © 1995 by The American Society of Mechanical Engineers