Lubrication and wear in forging

Results are, therefore, highly dependent on experimental conditions. An optimum speed in terms of minimum forging load and best die filling can sometimes be found. In judging the effectiveness of a lubricant, one must keep in mind that entirely different criteria apply to various forging geometries.

In upsetting and ring compression, the predominant variable is end-face expansion, and this is promoted by lubrication. In true closed-die (trapped-die) forging, the major deformation mode is extrusion, into a narrowing gap when draft angles are used. Interactions among oxides, lubricant, and forging speed can become difficult to separate.
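
The ring compression test referenced later in this article is the standard way this end-face behavior is quantified: at a given height reduction, an expanding bore indicates low friction and a shrinking bore indicates high friction. The sketch below shows only that bookkeeping; the ring proportions and measurements are hypothetical, and a real evaluation would read a friction factor from published calibration curves for the specific ring geometry.

```python
# Minimal reduction of ring-compression measurements (hypothetical numbers).
# Standard 6:3:2 (OD:ID:height) ring proportions are assumed; actual friction factors
# are read from published calibration curves for the chosen geometry.

def ring_test_indicators(id_before_mm, h_before_mm, id_after_mm, h_after_mm):
    """Return percent height reduction and percent change of the internal diameter."""
    height_reduction = 100.0 * (h_before_mm - h_after_mm) / h_before_mm
    id_change = 100.0 * (id_after_mm - id_before_mm) / id_before_mm
    return height_reduction, id_change

# Hypothetical ring of 30 mm OD x 15 mm ID x 10 mm height, compressed to 5 mm:
reduction, id_change = ring_test_indicators(15.0, 10.0, 13.8, 5.0)
print(f"height reduction {reduction:.0f}%, internal diameter change {id_change:+.1f}%")
# A negative diameter change (shrinking bore) at this reduction points to high friction;
# a positive change points to effective lubrication.
```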

In conventional impression-die forging the extrusion effect is combined with upsetting and lateral extrusion of the flash. The short contact time in the hammer aids die filling and neutralizes the friction effect. Die temperature is a most significant factor, but the effects are complex. Higher die temperatures result in less cooling and thus facilitate material flow, especially in impression- and closed-die forging. If increasing interface temperature results in an earlier breakdown of the lubricant, interface sliding decreases, and less outward expansion is found in ring compression.

Some lubricants fail to wet a hotter die, and friction increases.

Effect of Application Method

Even the best lubricant will fail if it is deposited discontinuously. At the same time, excessive coating thickness can lead to lubricant accumulation, unfilled forgings, and poor surface quality. Therefore, controlled deposition is essential. Hand application by swabbing is still practiced but is not satisfactory, particularly with aqueous lubricants.

Automated mechanical methods of application have been developed. A good application system must prevent settling out of solids in the holding tank, ensure reliable and uniform atomization (breaking up of the liquid into droplets by mechanical means or air pressure), and deliver the fine droplets to the die in a controlled manner. Hand-held spray heads suffice for production at lower rates, but mechanically operated stationary or oscillating spray bars are essential for high production rates.
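
As an illustration of what controlled deposition implies, the following sketch estimates how long a nozzle must spray to build a target wet-film thickness from the flow rate, the sprayed area, and the fraction of spray that actually lands on the die. All of the numbers are hypothetical; they are not taken from this article or from any particular lubricant system.

```python
# Hypothetical spray-time estimate for a target wet-film thickness:
#   film volume = thickness x area; spray time = film volume / (flow rate x transfer efficiency)

def spray_time_s(target_film_um, area_cm2, flow_ml_per_s, transfer_eff):
    """Seconds of spraying needed to deposit the target wet-film thickness."""
    film_volume_ml = (target_film_um * 1e-4) * area_cm2      # um -> cm; cm^3 == ml
    return film_volume_ml / (flow_ml_per_s * transfer_eff)

# Example: 20 um wet film over a 400 cm^2 die face, 5 ml/s nozzle, 60% transfer efficiency.
print(f"spray for about {spray_time_s(20.0, 400.0, 5.0, 0.6):.2f} s")
```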

Spray heads with sufficiently large orifices can be kept open if air is blown through them after each lubricant application. At hot forging temperatures, glass or similar inorganic substances can produce a thick film lubricant. The forging process imposes some special requirements. In both isothermal and nonisothermal forging, any accumulation of lubricant residues in the die cavity results in underfilled forgings. Therefore, the lubricant must be applied to the workpiece only, in the form of a thin coating.

The glass should wet the workpiece in order to follow surface deformation, but it should do so without attacking (corroding) the die or the workpiece. It should adhere to the workpiece sufficiently to be lifted out with the forging.

If glass adheres to the die, it should allow ejection of the workpiece without excessive force and without long, strong stringers. In nonisothermal forging, the heat-insulating capacity of the lubricant should be high. Isothermal forging temperatures are too high for graphite to survive, and BN serves as a useful parting agent.

Lubricant Variables

More relevant is the viscosity at the average of the die and workpiece temperatures. In nonisothermal forging, the workpiece surface temperature tends to drop with a good heat-insulant liquid film, but it can actually rise when the film breaks down and high friction generates heat.

For isothermal forging, it is more meaningful to relate glass viscosity to the flow strength of the material. There is no definite minimum film thickness, but typical values are approximately 0. Excessively thick films lead to surface roughening and glass buildup in the die. Some lubricants other than glass can serve as viscous fluids showing diminishing film thickness on increasing workpiece temperature.
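
To make the viscosity argument concrete, the sketch below evaluates a Vogel-Fulcher-Tammann (VFT) viscosity law at the mean of the die and workpiece temperatures, which is the temperature the article suggests is most relevant. The VFT constants and the temperatures are hypothetical placeholders, not data for any particular lubricant glass.

```python
# Sketch: glass viscosity at the mean of die and workpiece temperatures, using a
# Vogel-Fulcher-Tammann law, log10(eta) = A + B / (T - T0). Constants are hypothetical.

def glass_viscosity_pa_s(temp_c, a=-2.5, b=2500.0, t0_c=200.0):
    return 10.0 ** (a + b / (temp_c - t0_c))

die_temp_c, workpiece_temp_c = 300.0, 1100.0
mean_temp_c = 0.5 * (die_temp_c + workpiece_temp_c)
print(f"viscosity at the mean temperature of {mean_temp_c:.0f} C: "
      f"about {glass_viscosity_pa_s(mean_temp_c):.0f} Pa*s")
```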

Wetting of the workpiece surface by the glass and protection during preheating are important, but the glass with the best protective properties is not necessarily the best lubricant on steel. The best protection is ensured by preheating in a glass bath, the glass then serving also as a lubricant. Glass is a good heat insulator and reduces cooling rates during transfer from the furnace. Although the ring test has been used to evaluate the frictional conditions in hot forging, other tests have been developed.

The warm/hot upsetting sliding test [44] is one such test, and die wear during warm forging can be quite different from that occurring during hot forging [45]. Attempts to mitigate the environmental effects and lubricant disposal issues have been undertaken [46]. This issue has also drawn international attention from many researchers and forging facilities, with interest increasing recently [47-51]. The life of hot forging dies ranges from a few hundred to some tens of thousands of parts.

It is short enough to have prompted serious investigations into causes of die wear, especially because die costs account for some 15 percent of total production costs. High die temperature is destructive because the die surface loses strength in a thin layer, which makes it less resistant to abrasion and plastic deformation. Therefore, wear under unlubricated conditions is found to be inversely proportional to die temperature.

Conversely, rapid heating above the transition temperature of the die surface on contact, followed by quenching by the cold backing, leads to the formation of a brittle, hard martensitic layer, which is prone to fatigue but more resistant to abrasive wear. The main variable is die temperature. Die life decreases with increasing weight of the forging, because the higher heat content of a larger piece results in higher die temperatures. Interruptions of the smooth flow of production are harmful, because they increase temperature excursions. Contact dwell and cycle times affect bulk die temperatures and surface temperature gradients in the die.

Contact time increases with the number of blows on a hammer, leading to more severe wear if the hammer is too small for the part. Wear is especially severe in the finishing blows on strong alloys. Stickers are most harmful and result in much-reduced die lives, especially in press forging without an ejector.

Excessive temperature is harmful, but so are excessive temperature excursions. Thermal shock and thus thermal fatigue are minimized by appropriate preheating of the die, preferably with an evenly distributed heat source (gas or electric) rather than with a localized high-temperature source such as a hot workpiece. The optimum preheating temperature is a function of both die and lubricant composition.

Die configuration, together with lubrication, determines material flow. More complex parts need higher pressure and more blows to fill in a hammer, are more likely to stick, and throw more flash, resulting in increased wear. A wider flash land wears less but at the expense of higher cavity pressure. The surface topography of milled dies has been found to influence the wear of hot forging dies [52]. Of all the factors that influence abrasive wear, the one easiest to understand and quantify is that of die material and hardness.
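
The remark above that a wider flash land raises cavity pressure can be made concrete with a textbook slab (friction-hill) estimate for plane-strain flow across the land with a constant shear friction factor. The flow stress, friction factor, and geometry below are hypothetical; this is a generic estimate, not a calculation from the article.

```python
import math

# Slab-method estimate of the pressure at the cavity edge of the flash land:
#   p = S * (1 + m * w / t), with S = 2/sqrt(3) * flow stress (plane strain),
#   m = shear friction factor, w = land width, t = flash thickness. Numbers are hypothetical.

def cavity_edge_pressure_mpa(flow_stress_mpa, m, land_width_mm, flash_thickness_mm):
    s = 2.0 / math.sqrt(3.0) * flow_stress_mpa
    return s * (1.0 + m * land_width_mm / flash_thickness_mm)

for width in (3.0, 6.0, 12.0):   # widening the flash land
    p = cavity_edge_pressure_mpa(flow_stress_mpa=80.0, m=0.3,
                                 land_width_mm=width, flash_thickness_mm=2.0)
    print(f"land width {width:4.1f} mm -> cavity-edge pressure ~{p:.0f} MPa")
```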

In general, increasing alloying content and die hardness both tend to increase the resistance of forging die steels to abrasive wear. Low-alloy die steels, such as 6F2, 6G, and 6H1, generally have poor resistance to wear as compared to hot-work die steels, such as H13 and H This difference is because the microstructures of the latter steels are not only inherently more resistant to wear, but they also tend to be more stable at higher temperatures.

Low-alloy steels have far inferior wear resistance compared to hot-work die steels because the higher alloy content of the latter leads to higher hardness and the ability to retain strength at high die temperatures. The effect of various alloying elements on wear of steels has been studied by various authors. The wear resistance increases with increasing contents of carbon and carbide-forming elements, and the presence of elements that are not carbide-forming in martensitic die steels may even be detrimental [53].

Thus, vanadium and its associated carbides are eight times as effective as tungsten and its carbides in reducing wear. Good wear resistance is obtained when the total alloy content is in excess of 3 percent. Molybdenum has a strong effect on reducing wear, but quantities in excess of 2 percent are not needed. An example showing wear as a function of alloy content (Mo wt percent) is given in Figure 7 [54].

This observation is an important one since the nickel-base alloys are generally many times the cost of the die steel alloys and are harder to machine. Die hardness is another factor whose influence on abrasive wear is easy to quantify. There are two basic processes involved [56]. The first is the formation of plastically deformed grooves that do not involve metal removal, and the second consists of removal of metal in the form of microscopic chips.

Because chip formation, as in metal cutting, takes place through a shear process, increased metal hardness could be expected to diminish the amount of metal removal via abrasive wear. This trend is exactly as observed experimentally and in production environments. The dependence of wear rate on hardness is greatest for low-alloy die steels such as 6F2 [53]. There is a correlation between hardness and wear of die steels with microstructures different from the typical die steel structure of tempered martensite. It has been found that the isothermal heat treatment of steels to produce lower bainite results in better wear resistance.
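
The inverse relation between hardness and abrasive metal removal is commonly expressed with an Archard-type wear law, in which wear volume is proportional to load and sliding distance and inversely proportional to hardness. The sketch below applies that standard relation with entirely hypothetical values for the wear coefficient, load, sliding distance, and hardness; it illustrates the trend discussed here rather than any measured result.

```python
# Archard-type abrasive wear estimate: V = K * F * s / H
#   K = wear coefficient (dimensionless), F = normal load (N),
#   s = sliding distance (m), H = hardness (Pa). All values are hypothetical.

def archard_wear_volume_mm3(wear_coeff, load_n, sliding_m, hardness_hv):
    hardness_pa = hardness_hv * 9.81e6            # approximate HV (kgf/mm^2) -> Pa
    return wear_coeff * load_n * sliding_m / hardness_pa * 1e9   # m^3 -> mm^3

for hv in (400, 500, 600):                        # e.g., die hardness after different tempers
    v = archard_wear_volume_mm3(wear_coeff=1e-5, load_n=5.0e5, sliding_m=50.0, hardness_hv=hv)
    print(f"{hv} HV -> roughly {v:.0f} mm^3 of abrasive wear over the assumed duty")
```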

The initial increase can probably be attributed to the increase in the amount of scale on the billets, which acts as an abrasive during the die wear process. The effects of lubrication and die temperature on die wear have been interpreted in a variety of often-conflicting ways. This is because lubricants and die temperature influence (a) lubricity, and hence the amount of metal sliding during forging, (b) the interface pressure during deformation, and (c) the heat transfer characteristics between the dies and workpiece during conventional hot forging.

Heat transfer is important not only through its influence on heat absorption into the dies, and thus thermal softening and decreased wear resistance of the dies, but also through its effect on the performance of the die and billet lubricants themselves. Investigations into the effect of lubrication on die wear in simple upsetting have shown that wear is greatly increased when the dies are lubricated versus when they are not [59], as shown in Figure 9. The same phenomenon was also found when upsetting successive lots of 1, samples each on a flat die in a mechanical press.

In these tests, the amount of wear was greater for the lot involving lubricated compression tests [57]. From these findings, one might conclude that wear increases with lubrication because of increased sliding and that lubrication is detrimental in forging. This point is clarified, however, by calculating the amount of wear for equivalent amounts of metal flow past a given point; lubrication reduces wear by a factor of three when compared to forging without lubrication. Moreover, in closed-die forging, the amount of metal sliding is fixed by die and preform design and not lubrication.
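
The factor-of-three statement above amounts to normalizing the measured wear by how much metal actually slid past the die surface. The short sketch below illustrates that normalization with made-up numbers chosen only to reproduce the qualitative outcome; they are not the values from the cited tests.

```python
# Hypothetical illustration: more total wear with lubrication, yet less wear per unit sliding.
cases = {
    # case: (total wear depth in um, total metal sliding past the point in m)
    "unlubricated": (30.0, 10.0),
    "lubricated":   (45.0, 45.0),
}
for name, (wear_um, sliding_m) in cases.items():
    print(f"{name:12s} total wear {wear_um:5.1f} um, "
          f"specific wear {wear_um / sliding_m:.1f} um per metre of sliding")
# The lubricated die shows more absolute wear only because far more sliding occurred;
# per metre of sliding it wears at one third the unlubricated rate.
```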

Thus, the amount of sliding over the flash land, where wear is usually greatest, depends on the amount of flash that must be produced and not on the efficiency of the lubricant employed. Because the amount of flash will be roughly the same with or without lubrication, employing lubricants in closed-die forging should reduce abrasive wear of the flash land and other parts of the die cavity. The interaction of lubrication and die temperature effects was demonstrated [60] in upset tests on a high energy rate forming (HERF) machine.

These tests were run with various bulk die temperatures, dwell times, and cycle times. Results established that die wear after upsetting of 1, billets decreased with increasing die temperature. These results were correlated with decreased amounts of sliding at higher die temperatures due to an increase in the coefficient of friction.

Increasing dwell time increases die chilling. As a result, metal flow is hindered, and die wear is reduced. Increased cycle time (time between forgings) tends to have the reverse effect of increasing dwell time; that is, it increases die wear because of lower coefficients of friction and more sliding. However, these effects have been found to be slight in upset tests conducted in a HERF machine [60]. A striking die wear concern is the generally higher wear experienced by the top die versus the lower die, which is most noticeable in their lubricated upset tests, also shown in Figure 9.

This difference can be attributed to greater chilling of the workpiece on the bottom die (and, correspondingly, greater heating of that die) because the hot workpiece was placed on it prior to forging. This higher die temperature could, therefore, have been expected to lead to greater friction, less sliding, and thus less abrasive wear than the top die experienced. From a practical standpoint, increased production rates in a forge shop may be expected to lead to lower die life. Increased die temperature is often used to increase production rates.

In forging under production conditions, the die surface temperature observed between two consecutive forging blows seems to remain unchanged throughout a production run. During the actual forging operation, the die surface temperatures increase and reach a maximum (peak) value and decrease again when the dies are separated and the forging is removed.
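
The surface-temperature spike described here can be approximated, for the first instant of die-workpiece contact, by the classic two-semi-infinite-body result in which the interface temperature is the thermal-effusivity-weighted mean of the two surface temperatures. The property values and temperatures in the sketch below are rounded, hypothetical placeholders for a hot-work die steel and a hot carbon steel; the result is an order-of-magnitude illustration, not data from the article.

```python
import math

# First-instant contact temperature of two semi-infinite bodies in perfect thermal contact:
#   T_interface = (e_die * T_die + e_work * T_work) / (e_die + e_work), e = sqrt(k * rho * c)
# Property values are rounded, hypothetical placeholders.

def effusivity(k_w_mk, rho_kg_m3, c_j_kgk):
    return math.sqrt(k_w_mk * rho_kg_m3 * c_j_kgk)

e_die = effusivity(25.0, 7800.0, 460.0)       # hot-work die steel near 250 C (assumed)
e_work = effusivity(30.0, 7600.0, 650.0)      # carbon steel near forging temperature (assumed)

die_c, work_c = 250.0, 1150.0
t_interface = (e_die * die_c + e_work * work_c) / (e_die + e_work)
print(f"instantaneous interface temperature: about {t_interface:.0f} C")
```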

Therefore, in conducting die wear studies, it is suggested that an ejector be used to remove the part after forging, so that die temperatures do not increase because a forging sticks in the die. The effects of sliding on die wear are also qualitatively well known in forging practice. The most direct method of reducing die wear is to employ a die steel that is more resistant to wear, that is, one that is harder and that retains its hardness at high die temperatures [61].

This could mean changing from a low-alloy die steel to a chromium hot-work die steel. The decision to make such a change should be based on the suitability of the new die steel itself in the forging operation and the trade-off between expected increases in die life and increases in material and machining costs.

More recently, computational analyses have resulted in improved simulation capabilities, which allow for improved die design to reduce conditions that are conducive to abrasive wear [62-73], although the details of local pressure, temperature, and friction conditions, coupled with the precise material response to these conditions, are critical to the ability to accurately simulate any forging process.

Although computational analyses are advancing, experimental evaluations of wear remain an active area of investigation (for example, [74-77]), and precise mechanisms of wear and local stress, temperature, and friction conditions are often unknown. Coating, hardfacing, and surface treatment of forging dies often can be employed to improve wear resistance as well. Some specific coating and hardfacing alloys and surface treatments such as nitriding and boriding are beneficial.

These include the use of chromium and cobalt-base coatings, weld deposits of higher-alloy steels onto low-alloy steels [78], weld deposits of nickel and cobalt hardfacing alloys on die steels [79], ceramic coatings, physical and chemical vapor deposition coatings [80, 81], and surface nitriding [82]. Studies have been done to show the effect of lubricant and coating on the life of the die [83]. Another means of reducing wear in the forging of steel involves reducing the scale on heated billets; scale acts as an abrasive during the sliding that occurs between the dies and workpiece.

Poor control of scale can reduce die life as much as percent. Methods of reducing scale are relatively obvious. One final means of decreasing the problem of wear is through improved redesign of the blocker shape. This approach is an important consideration because wear is strongly dependent on the amount of sliding that occurs on a die surface.

Thus, it is possible to reduce sliding, thereby reducing wear, by redesigning the blocker shape.

Special thanks are extended to Professor John Schey for his outstanding work Tribology in Metalworking: Friction, Lubrication and Wear (ASM International), from which information has been extracted and which forms a foundation for this article. The authors also thank R. Shivpuri and S. All rights reserved. This paper was reprinted with the permission of ASM International. The book may be purchased in its entirety at www.
