
CHAPTER 2


PRECISION MEASUREMENT TOOLS

Precision measurement tools are a necessity for both machining and assembly procedures. In this chapter I discuss common precision measurement tools and systems that are used on a routine basis in any engine machining shop.

Micrometer

A micrometer is a measurement tool available in various formats, including an outside micrometer, used to measure thickness, length, or diameter; an inside micrometer, used to measure the inside diameter of a hole; and a depth micrometer, used to measure hole depth.

An outside micrometer features a C-clamp style frame. At the far end is a flat-faced anvil. Facing the anvil at the opposite end of the frame is a flat-faced spindle. The anvil is in a fixed position on the frame. The spindle moves as the micrometer is adjusted. The sleeve features incremented index marks and is stationary. The thimble at the far end of the grip rotates and features additional marks. A lock lever is provided to secure the adjustment in a locked position when required.

An inch-format micrometer features an internal screw drive with 40 threads per inch. One complete revolution of the screw moves the spindle 1/40 inch, which is equal to .025 inch. Each mark on the sleeve represents .025 inch. The beveled face of the thimble has 25 equally spaced lines, each of which represents .001 inch.


The primary components of an outside micrometer are labeled here, including the frame, stationary anvil, sliding spindle, stationary index marked sleeve, and rotating thimble. Micrometers feature a lock that allows you to lock a measurement for reference. This micrometer features a toggle lever lock. A micrometer set typically includes ranges of 0 to 1 inch, 1 to 2 inches, 2 to 3 inches, and 3 to 4 inches. Also included are three calibration standards and a spanner wrench for adjusting micrometer calibration.

When reading a micrometer, first note the larger lines and numbers on the sleeve, and then look at the smaller lines between the numbered lines. As an example, if the number 3 is visible on the sleeve, that means that you are measuring at least .300 inch. Let’s say that three smaller lines after the number 3 are also visible. You now know that the measurement is at least .375 inch (remember that each line indicates .025 inch, so seeing three of these lines indicates .075 inch).

Next, note the lines on the moving thimble to see which thimble line aligns with the horizontal reference line on the sleeve. As an example, let’s say that the thimble line is identified as number 8. Because each thimble line indicates a space of .001 inch, that means that you have an additional .008 inch. Because the previously noted measurement was .375 inch, when you add the thimble mark of 8, the final measurement is .383 inch.
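If it helps to see the arithmetic spelled out, here is a minimal sketch in Python (my own illustration; the function and its inputs are hypothetical, not part of any micrometer) that combines the three readings into a final measurement:

```python
def read_outside_micrometer(sleeve_number, small_lines, thimble_line):
    """Combine inch-micrometer readings into a decimal measurement.

    sleeve_number: largest numbered sleeve mark visible (each = .100 inch)
    small_lines:   smaller sleeve lines visible past that number (each = .025 inch)
    thimble_line:  thimble mark aligned with the sleeve line (each = .001 inch)
    """
    return sleeve_number * 0.100 + small_lines * 0.025 + thimble_line * 0.001

# The example from the text: number 3, three small lines, thimble line 8
print(f"{read_outside_micrometer(3, 3, 8):.3f} inch")  # 0.383 inch
```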


This sample reading shows the number-2 line flush with the chamfered edge of the thimble, with the zero line of the thimble aligned with the horizontal sleeve line, indicating exactly .200 inch.

A first-time user of a micrometer will likely be intimidated, but with a bit of practice, reading a micrometer becomes much easier.

When measuring a part with a micrometer, adhere to one rule: go slowly. Avoid spinning or twirling the thimble. Adjust the micrometer to almost the desired opening by rolling the thimble along your hand. Place the micrometer onto the part, holding it firmly with one hand. Use your sense of feel to make sure that the axis of the micrometer is perpendicular to the measured surface; don’t rock the micrometer. Use the ratchet knob to close the micrometer until the spindle is almost touching the part, then gently continue until the ratchet slips by one click.

Always store a micrometer in a protective case when not in use. These are delicate, precision instruments and require care when handling or storing.

“Digital” micrometers are also available; they feature an easy-to-read measurement number in a display window, eliminating the need to read the traditional mark lines. However, it is absolutely critical that either style must be routinely kept in calibration.


This reading shows one mark on the sleeve greater than the number 2, which translates into .225 inch. The number 2 indicates .200 inch, with one additional line visible at the edge of the thimble. Because each line on the sleeve indicates .025 inch, the measurement shown here is .225 inch.


This digital readout micrometer must be properly calibrated. The reading is displayed in the viewing window. The mic can also be read referencing the marks on the sleeve and thimble. Close-up of a digital mic window. The lock on this model is a knurled rotating knob instead of a toggle lever.


Measuring piston skirt diameters requires having outside micrometers in the range of the pistons that you deal with. When measuring a piston skirt diameter, always measure at the skirt location specified by the piston maker, since a slight taper may exist from the ring area to the skirt bottom.

Calibrating a Micrometer

A micrometer should be checked regularly for calibration because calibration can be affected by wear or damage (dropping it, etc.). To check calibration, first verify that the anvil and spindle surfaces are clean. For a 0- to 1-inch micrometer, insert a clean sheet of paper between the anvil and spindle, close the gap to capture the paper, then pull the paper out to wipe the faces, taking care not to leave lint behind. Next, you need a “standard” (also called a checking block), which is a length of steel that has been precision ground to an exact length. Standards are readily available, and many micrometer kits include a set of standards in various lengths. Both the micrometer and the standard must be absolutely clean, and both must be at room temperature.

Capture the standard between the anvil and spindle, tightening the micrometer to “feel” (don’t overtighten with force), and make sure that the micrometer’s anvil and spindle are mated squarely onto the standard (not cocked). If you are using a 1-inch standard and the reading is not exactly 1 inch, the micrometer must be recalibrated. Engage the micrometer’s lock to prevent it from moving. Hold the knurled thimble with one hand. Insert the spanner wrench (included with the micrometer) into the small hole in the micrometer sleeve. Hold the spanner wrench between the thumb and forefinger of your other hand, with your thumb resting on the sleeve. Rotate the spanner wrench to make the correction, adjusting until the micrometer reads exactly 1 inch. Then unlock the micrometer and verify that the reading is exactly 1 inch when measuring the 1-inch standard.


Standards (measuring block references) are an absolute must, allowing you to routinely check a micrometer for calibration. The three standards, in 1-, 2-, and 3-inch lengths, are shown here. These are precision-ground reference blocks and must be handled with care.


Outside micrometers feature a small hole in the sleeve for the use of a spanner wrench, which allows you to adjust the mic for calibration.

Precision measuring instruments must be kept clean. Even the slightest dust particle can alter your reading. Clean the anvil and spindle surfaces prior to every use. After cleaning with a soft cloth or towel, blow dust away with your mouth. Don’t use high-pressure shop-compressed air, since the high velocity of compressed air can force dust particles into the tool’s mechanism.


A micrometer stand is highly recommended. It secures a micrometer in padded jaws, allowing you to make calibration checks and various measurements without having to hold the micrometer in one hand while trying to measure a standard or set up a bore gauge.


With a measuring standard in place, the spanner wrench is used to adjust the mic so that it precisely reads the standard length. Here a mic is being adjusted so that it reads exactly 1 inch while using a 1-inch standard.

Occasionally, apply a drop of precision instrument oil to the spindle where it protrudes from the frame.

Caliper

A caliper can be used to measure outside or inside dimensions; for instance, the outside diameter (O.D.) of a valvestem or the inside diameter (I.D.) of a hole. Calipers are available in both dial and digital styles. A dial caliper’s gauge usually features dial marks in increments of .001 inch. The major dimension is marked on the caliper’s slide (for instance, 1 inch, 2 inches, etc.). The numbered marks on the slide that are visible between the caliper’s fixed and movable jaws indicate one-tenth-inch increments (for example, the number 5 indicates .500 inch). The gauge needle further refines the measurement: each dial mark the needle moves past adds .001 inch to the reading. For instance, if the number on the slide indicates 5 and the gauge needle indicates 13, the reading is .513 inch.
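A dial caliper decodes the same way: slide number plus needle. A quick sketch, again purely illustrative, following the example above:

```python
def read_dial_caliper(slide_number, dial_needle):
    """slide_number: numbered slide mark at the jaw (each = .100 inch)
    dial_needle:  dial needle position (each mark = .001 inch)
    """
    return slide_number * 0.100 + dial_needle * 0.001

print(f"{read_dial_caliper(5, 13):.3f} inch")  # 0.513 inch, as in the text
```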


Shown is how to measure the diameter of a pushrod. This example shows a measurement of .3013 inch. Example of a commonly available dial caliper. A dial caliper is essentially a ruler that adds increased measuring precision with the addition of a dial indicator. A dial caliper is very easy to read. This example shows a setting at .5013 inch. The initial distance is represented on the slide, where the inside of the movable jaw aligns with the marks on the slide. Each individual line on the dial represents .001 inch.


Although a traditional outside micrometer may be used to measure brake rotor thickness, a dedicated rotor thickness caliper is a better choice. The stationary anvil features a precision-ground flat surface, while the sliding arm’s contact features a pointed surface.


The rear tip of a dial caliper can be used to measure depth. Extend the tip outward by rolling the thumbwheel of the caliper, insert into the hole or groove, and gently push downward to bottom the end of the rule body against the top surface, then read the measurement on the gauge.

Dial Indicator

A dial indicator features an incremented gauge and a spring-loaded plunger. As the plunger moves in or out, the gauge needle registers how far the plunger travels.

Dial indicators are used in a variety of applications, which determine what type of mounting fixture is required.

Applications include (but are not limited to) measuring brake rotor lateral runout, piston position relative to top dead center (TDC) in an engine block, crankshaft thrust, and runout on a crankshaft, camshaft, valvestem, pushrod, etc. In each case, the dial indicator gauge must be rigidly mounted so that only the gauge plunger moves.

Regardless of the application, the method of adjusting and reading the gauge is the same.

Let’s use checking crankshaft thrust/endplay as an example. On an iron engine block, the dial indicator is mounted to a fixture with a magnetic base (for an aluminum block, a fixture that bolts to an available threaded hole works). Position the dial indicator so that the plunger contacts the face of the crankshaft snout or other available flat surface. The plunger must be parallel to the crankshaft centerline, not at an angle relative to the crank. Adjust the dial indicator so that the plunger contacts the surface and creates a slight preload (about .050 inch or so). The preload is vital to make sure that the plunger remains in loaded contact with the crank at all times. Using a lever, such as a screwdriver placed between a main cap and a crankshaft counterweight, move the crankshaft rearward as far as it will go and remove the screwdriver. Rotate the gauge dial so that the needle reads exactly zero. Then use your lever to move the crankshaft fully forward, noting the distance of travel on the gauge. Repeat this reading several times. Move the crank fully forward and re-zero the gauge, then move the crank rearward (the same reading should result). Typically, depending on the application, you may see .004 to .008 inch or so of crank fore/aft movement.
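Since you repeat the endplay reading several times, a simple check like the sketch below can average the readings and compare them to a spec window (the .004- to .008-inch range is only the rough figure quoted above; substitute the actual specification for your application):

```python
def check_endplay(readings, spec_min=0.004, spec_max=0.008):
    """Average repeated dial indicator readings (inches) and report
    whether the result falls inside the acceptable endplay window."""
    avg = sum(readings) / len(readings)
    return avg, spec_min <= avg <= spec_max

avg, ok = check_endplay([0.0060, 0.0062, 0.0061])  # hypothetical readings
print(f"average endplay {avg:.4f} inch, within spec: {ok}")
```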

The use of a dial indicator is the same for all applications, whether you’re measuring for movement or runout. For example, if you’re checking pushrods for runout (warp), a special pushrod runout stand is ideal. Lay the pushrod onto the stand and position the dial indicator at the center of the pushrod. Adjust the dial indicator with a bit of preload. Slowly rotate the pushrod until the indicator gauge reads maximum or minimum. Then zero the gauge dial and slowly rotate the pushrod, noting how far from zero the needle moves. This represents the amount of runout. Generally speaking, a maximum of .0005 to .001 inch is acceptable.

When checking a brake rotor disc for runout, the dial indicator needs to be mounted rigidly to prevent the gauge housing from moving. Several methods are available including a mount with a magnetic base and adjustable-length rods to position the gauge, or a flexible/locking arm that secures to a solid area such as a strut bracket (the gauge must be mounted to a component that does not move independently, relative to the rotor). Place the dial indicator plunger 90 degrees to the rotor (straight onto the disc surface, not at an angle). The plunger should contact the rotor surface about 1 inch inboard from the disc edge. Adjust the indicator with a bit of preload (about .050 inch), and slowly rotate the rotor until the gauge reads at a minimum point. Zero the gauge needle and slowly rotate the rotor a full 360 degrees, watching the gauge needle. The maximum gauge reading relative to zero represents the amount of rotor runout. Always refer to the manufacturer’s runout specifications, but generally speaking, a maximum of about .002 inch should be acceptable.


A pushrod runout checker features a stand and a dial indicator. Position the pushrod on the stand cradles. Adjust the dial indicator so that the indicator plunger contacts the center of the pushrod radius, and adjust the indicator to create approximately .050 inch of preload, then zero the indicator gauge. Slowly rotate the pushrod and monitor the indicator to measure any pushrod runout. This type of checking tool makes it easy to measure each pushrod for runout/bend. Generally speaking, runout in excess of .001 inch is unacceptable.


A magnetic-base dial indicator can be used to measure a variety of dimensions. In this example, the magnetic base is secured to an iron block deck with the indicator plunger contacting a piston dome, which is one method of checking piston location at TDC (top dead center).


Shown here is a dial indicator being used to measure installed valve depth relative to the deck on a racing cylinder head. A bridge fixture rests on the head deck, with the dial indicator mounted to the bridge. This allows measuring each valve’s depth, so that the seats can be machined to an identical depth to optimize performance.


Here, a magnetic base dial indicator is used to measure crankshaft thrust. The magnet must mate to a clean, flat surface for proper stability. The indicator is adjusted on the tool’s extension rod to locate the indicator plunger onto the crankshaft snout. The magnetic base is secured to the iron engine block.


Here, a dial indicator plunger is set up to contact the front face of the crank snout. Regardless of the point of contact, the indicator plunger must be set up parallel to the crankshaft centerline, not at an off-angle. If the plunger is at an improper angle, the reading will not be correct. Set the indicator up so that the plunger contacts a flat area of the crank, and push the indicator a bit toward the crank to obtain a gauge preload of about .050 inch. Push the crank fully rearward and zero the gauge. Then push the crank forward and observe the amount of travel. The close-up of this dial indicator shows that the crankshaft thrust measures a tad greater than .006 inch. The gauge needle has traveled just past the .006-inch mark. The small dial seen near the gauge center indicates additional travel in .0001-inch increments. This crank thrust reading is approximately .0067 inch.


If you’re dealing with an aluminum engine block, you may be able to secure the magnetic base to the face of a steel or iron number-1 main cap. Otherwise, a fixture that bolts to the engine block is needed to mount the dial indicator. In this example, an extension has been added to the indicator plunger, as the plunger contacts the crank’s snout base.

When using a dial indicator to measure brake rotor runout, the best style of indicator for this task is one that features a ball bearing tip on the plunger. This reduces the variable of contact chatter and provides a smoother, more consistent reading. Dial indicators are available with either a solid or roller tip.

Dial Bore Gauge

A dial bore gauge allows you to measure bore diameter in areas such as main bearing bores, cylinder bores, lifter bores, cam bores, etc. A bore gauge kit includes an array of various-length anvil extensions that mount to the tool, depending on the bore diameter range that you plan to measure.

Keep in mind that a bore gauge does not measure a bore diameter directly. The gauge is first set up to the target diameter and then zeroed. The gauge then shows how close the bore is to the target (at, under, or over the target diameter).


Shown here is a typical bore gauge set. The tool includes a gauge and a selection of extensions and spacer washers to accommodate a variety of specific bore diameters.

To set the gauge up, you need a micrometer. First, set the micrometer to the target bore size. For example, if the bore at hand is supposed to be 2.000 inches, set the micrometer at 2.000 inches.

Then, set up the dial bore gauge with the proper anvil extension that is able to accommodate a 2.000-inch bore. Bore gauge kits include a selection of extensions and spacer washers to allow you to adjust the gauge to the specified bore diameter.

Place the bore gauge between the micrometer’s spindle and anvil.

Gently rock the dial bore gauge back and forth as well as side to side in the micrometer. Set the bore gauge dial to zero at the minimum reading found while rocking it in the micrometer.

When the bore gauge is then inserted into the bore to be measured, the gauge dial shows how the bore diameter compares to the reference. When measuring a bore, rock the bore gauge back and forth as well as side to side to obtain a precise reading: the correct reading is the point where the needle stops, immediately before it starts to move in the opposite direction. For example, if the bore reading is .002 inch less than the zero mark, the bore is .002 inch too small. If the dial gauge reads .0017 inch greater than the dial’s zero mark, you know that the bore is currently .0017 inch too large. When you know what the bore diameter is, you can correct an undersized bore by enlarging it by boring or honing. If the bore is too large, you can correct (depending on the specific application) by sleeving a cylinder and then honing to size, installing a bushing in a lifter bore and honing to size, or moving to larger-diameter pistons or lifters, etc.


Prior to and during align honing of a block’s main bores, a bore gauge is used to monitor bore diameter. Here a measurement check is made at twelve and six o’clock. The main bores are also checked at three and nine o’clock and diagonally, to verify not only diameter but also runout.

In essence, you’re setting up the dial bore gauge for a target bore diameter, then using the gauge to see how much the existing bore deviates from the target diameter.
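Because the dial reads deviation rather than diameter, the actual bore is simply the target plus the signed needle reading. A hypothetical helper makes the sign convention explicit:

```python
def bore_from_deviation(target_dia, dial_reading):
    """target_dia:  diameter the gauge was zeroed to (inches)
    dial_reading: signed needle reading; negative = undersize, positive = oversize
    """
    return target_dia + dial_reading

print(f"{bore_from_deviation(2.000, -0.0020):.4f}")  # 1.9980 (bore .002 inch small)
print(f"{bore_from_deviation(2.000, 0.0017):.4f}")   # 2.0017 (bore .0017 inch large)
```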

Setup Tip

Although setting up a bore gauge sounds simple enough, the tricky part is obtaining a precise point in the micrometer. Basically, you need three hands, because it’s difficult to keep the gauge properly centered in the micrometer while holding each tool by hand. You can mount the micrometer in a bench vise, leaving both hands free to manipulate the gauge, but that gets tricky, because overtightening the vise even slightly can distort the micrometer.

A better solution is to use a specialty tool that safely secures the micrometer. An example is Goodson’s Micrometer Stand (MIC-FIX). This serves as a third hand, freeing both hands to hold the bore gauge. This stand features rubber jaws that secure the micrometer without damaging it.

Another option is to secure the bore gauge itself with a dial bore gauge setting fixture. Goodson’s bore gauge setting fixture accepts most popular gauges including Mitutoyo, Phase II, Fowler, Peacock, and others. This fixture does not work with a Sunnen bore gauge, but Sunnen offers an appropriate fixture for its gauges.

Mounting a Bore Gauge with a Setting Fixture

Instead of using a C-frame style micrometer, a bore gauge can be set up using a dial bore gauge setting fixture, such as Goodson’s DBG-FIX. This solid milled-steel base accepts a standard (three are included, covering the 2- to 3-inch, 3- to 4-inch, and 4- to 5-inch ranges). First adjust the end opposite the adjustable spindle to provide a secure pocket for the end of the bore gauge. A center anvil is featured against which one side of the bore gauge rests. Two opposing adjustable set screws, one on each side, are then adjusted to capture the sides of the bore gauge. Adjust the two opposing set screws equidistant from the center anvil. Don’t tighten these set screws against the bore gauge. Rather, adjust the set screws to provide .005- to .010-inch clearance. This provides proper alignment of the bore gauge when placed in the fixture, which in turn aligns it to the fixture’s spindle and standard.


Although a bore gauge can be set up by using a micrometer (with the micrometer adjusted to the desired bore diameter), a bore gauge setting fixture eases the task, eliminating the need to hold an outside micrometer. A selection of standards are provided for the 2- to 3-inch range, 3- to 4-inch range, and 4- to 5-inch range, with final adjustment via the built-in micrometer. The bore gauge rests at the pocket end of the setting fixture. The threaded set screws on either side are adjusted an equal distance from the center anvil seen here. This centers the gauge. Do not adjust the set screws tightly against the gauge. Allow .005- to .010-inch clearance at each side of the gauge.


Insert a standard into the fixture, based on the target bore diameter. Adjust the mic thimble to achieve your precise target diameter. When the fixture is set up, you’re ready to adjust your bore gauge.


Choose the bore gauge extension that’s appropriate for your target bore diameter. The short end seen here inserts into the gauge. To alter the length of the extension as needed, install a supplied spacer washer onto the short end of the extension. Here I installed a .100-inch-thick washer. Gauge kits commonly include spacers .100, .050, and .020 inch thick.

Install the appropriate standard; for example, if the bore gauge needs to be set up to check a 3.500-inch bore, install the standard that covers the 3- to 4-inch range. Then adjust the fixture’s micrometer spindle to obtain a fixed setting at 3.500 inches.

Set up your bore gauge with the appropriate extension and washers for your desired bore diameter. Place the bore gauge in the setting stand. If it is too long to fit, change to a shorter extension or fewer washers. If too short, go with a longer extension or add washers as needed. Adjust the bore gauge so that you have a bit of preload on the gauge when placed in the fixture. Rock the bore gauge toward and away from the fixture’s micrometer. With the gauge held at its minimum reading, adjust the gauge bezel to obtain zero on the gauge. The bore gauge is now ready to check the target bore.


Insert the extension into the gauge. Install the threaded collar onto the gauge to secure the extension. The side of the gauge opposite the extension features two rollers that guide the gauge in the bore. The center of the gauge features a spring-loaded plunger that presses against the bore wall. As the plunger is depressed or extended, the movement registers on the tool’s dial indicator.


A valveguide bore gauge is required to measure the inside diameter of a guide. Measurements should be taken at the bottom, middle, and top areas of each guide. Here a split-ball gauge is inserted into the guide. The gauge is allowed to relax and expand inside the guide, then tightened to maintain the diameter.


With the bore gauge installed in the setting fixture, gently rock the gauge forward and rearward, noting the minimum indicator reading. Holding the gauge steady at this minimum reading, zero the dial indicator’s bezel so that the needle rests at the zero mark. You now have a zero reference for your target bore diameter. Insert the gauge into the bore (the bore must be clean and free of oil and contamination) and gently rock the gauge in opposite directions, noting the minimum needle reading (where the needle runs to a point and begins to reverse direction). The reading on the dial indicator shows how far the bore measures under or over your target diameter.

Depth Gauge

A depth gauge is a micrometer-style tool that features a flat base and allows the tool’s spindle to protrude through the base. This gauge allows you to determine the depth of a hole, recess, slot, keyway, etc. Securely rest the flat base against the reference surface with the spindle hole facing the recess to be measured. Rotate the spindle until it gently contacts the surface at the bottom of the hole or recess. Read the micrometer, which reveals the difference in height between the two surfaces.


A depth gauge allows you to measure the depth of a hole, groove, recess, etc.


A depth gauge features a spindle that protrudes out of the flat base. Make sure that the base and spindle face are clean of any oil, dust, etc.

Internal Caliper Gauge

An internal caliper gauge features two arms with small contact tips on the outside of each arm. This tool allows you to measure the inside diameter of a hole that is otherwise difficult to access. Compress the arms together, insert them into the hole or recess, and release the arms. The tool features a lock that may be engaged before removing the tool from the hole. The gauge then reveals the diameter. This type of gauge is available with either a needle dial or a digital readout.

Measuring Pushrod Length

A pushrod’s effective length does not necessarily match its overall length, depending on the pushrod style. You must consider the oil orifice at the end (if an oil hole is featured). If you measured an existing pushrod using a large caliper, you’d be fooled by about .017 inch or so (on the short side), because the caliper would contact the small flat at the end that results from the hole opening. If you’re using a pushrod checker, this isn’t an issue, unless you’re also using an existing pushrod that you measured as a reference.


A digital inside caliper allows you to easily measure the inside diameter of a hole. The arms are spring-loaded and the measurement readout is displayed in the viewing window.

Compress the arms together to enter the hole, then release tension. The arms expand away from each other as the arm tips contact the walls. This type of gauge is very easy to use and is extremely accurate.

Also, if the pushrod style you intend to use features a top cup (female pocket instead of male radius), you need to remember that your measurement must take place at the female radius seat, not at the outer edge of the cup. The easiest way to accurately measure a cup-equipped pushrod is to drop a steel ball into the cup, take your measurement, then subtract the ball diameter. For instance, if you’re dealing with a 5/16-inch cupped pushrod, place a 5/16-inch-diameter steel ball into the cup. Then you can measure overall length (contacting your caliper on the ball). Then subtract the actual ball diameter from this measurement, which provides an accurate length to the radiused seat in the pushrod cup.

If you use a ball to help measure a cupped pushrod, take the time to actually measure the ball diameter using a caliper. Don’t just assume that the ball meets an advertised diameter. A 5/16-inch ball should measure .3125 inch in diameter.
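The ball-in-cup math is a simple subtraction. The sketch below uses hypothetical numbers; the important point is that the ball diameter you subtract is the measured one, not the advertised one:

```python
def cupped_pushrod_length(measured_over_ball, measured_ball_dia):
    """Length from the lower radiused tip to the seat of the upper cup."""
    return measured_over_ball - measured_ball_dia

# Hypothetical example: caliper reads 7.8125 inch over a ball that mics at .3125 inch
print(f"{cupped_pushrod_length(7.8125, 0.3125):.3f} inch")  # 7.500 inch
```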

Just remember that pushrod lengths refer to the distance between the radiused contact points at each end of the pushrod.

Before you can measure pushrod length, the cylinder head must be fully installed; that means that the head gasket must be in place, and the head fasteners must be fully tightened to their final value.

Verify that clearance between the edge of the valvespring retainer and the underside of the rocker is at least .040 inch.


A long-slide caliper can be used to measure pushrod length, or when measuring an adjustable checking pushrod when determining a custom pushrod length. If a cup-style pushrod is being measured, install a ball bearing into the cup during measurement, then subtract the diameter of the ball to determine the required length of the pushrod, from lower radiused tip to the seat of the upper cup. The ball should be sized appropriately for the radius of the cup (for example, 5/16 inch in diameter, or .3125 inch).


Checking pushrods are available in a wide variety of length ranges and styles.

When measuring for pushrod length, you must use the valvetrain parts that will be used in the final assembly, including block, heads, camshaft, lifters, valves, rocker studs, and rockers. When you’re ready to measure for pushrod length, the only remaining variable should be the length of the pushrods.


When measuring for pushrod length, swap out a valvespring for a light checking spring. This makes rotating the crank easier and the measurement more precise by eliminating the tension that a valvespring creates. A heavy valvespring can easily damage an adjustable checking pushrod.

Valvespring Height Gauge

A barrel-style micrometer designed specifically to measure valvespring installed height is available. Obtaining adequate valveseat pressure while avoiding coil bind is critical, especially with extreme valvespring pressures and radical camshafts. Although valvesprings should be tested on a valvespring gauge for seat and full-open pressure, the engine builder needs to be sure that the required installed height dimension can be duplicated during assembly. If the measured spring height differs from the target installed height, the height can be corrected by adding a shim to the spring seat or by milling the spring seat base.


A valvespring checker allows you to monitor both spring height and pressure. As you compress the spring to its specified installed height (monitored on a separate indicator), spring pressure is monitored on the tool’s pressure gauge.

Valveseat Runout Gauge

Checking the valveguide centerline relative to a valveseat is easily accomplished with a valveseat runout gauge. This gauge features an anvil rod that is inserted into the valveguide. The gauge features an adjustable contact that runs along the valveseat and a dial gauge. Slowly rotating the gauge along the valveseat reveals any runout of the seat relative to the guide. If runout is found, the seat can be resurfaced (or replaced and machined) to obtain zero runout.


A valvespring height micrometer is used to measure installed spring height on the cylinder head. The spring height checker is installed in place of a valvespring, with retainer and keepers installed. Adjusting the gauge to remove all slack with the valve in its fully closed position allows you to measure installed spring height. If this measured installed height is too tall for the specified installed height, shims may be added to the spring seat to compensate, or if too short, the spring seat may be machined to remove material.

Precision Straightedge

When checking a block, cylinder head, or manifold deck surface for flatness/warpage, a precision machinist’s straightedge is an absolute must. These tools are precision-ground steel straightedges that allow you to check for deck warpage using a feeler gauge. Do not rely on just any “straight” ruler or piece of scrap steel for this procedure. Invest in a dedicated precision straightedge. These are available in various lengths and should be treated with care to avoid nicking the edges or allowing the bare steel precision-ground edge surface to rust. Store in a safe location when not in use.


A precision machinist’s straightedge is an essential tool for checking deck surfaces for warpage. The straightedge is positioned front-to-rear on a deck, while a feeler gauge is inserted between the deck and the straightedge. Acceptable warpage limits vary depending on the length and type of cylinder head, intake manifold, or block deck, so refer to published specifications.

Rod Bolt Stretch Gauge

Published torque specifications aside, race engine builders have long realized that the correct approach to tightening connecting rod bolts is to stress the bolts into their “working” range, but not beyond. OEM connecting rod bolts may vary in ideal torque by as much as 10 ft-lbs from batch to batch due to variations in heat treating and materials. If the goal is both peak bolt strength and maintained concentricity of the rod’s big end, the rod bolts should be measured for stretch instead of simply being tightened until the torque wrench hits its mark.

In simple terms, to measure bolt stretch, first measure the total rod bolt length (from the head surface to the tip of the shank) in the bolt’s relaxed state. Then measure the bolt again after the nut has been tightened to value.

The difference in length indicates the amount of stretch the bolt experiences in its installed state. For the majority of production rod bolts, stretch is likely in the .0045- to .006-inch range. If the stretch is less, the bolt is probably experiencing too much friction that is preventing the proper stretch (requiring lubricant on the threads). If stretch is excessive, the bolt may have been pulled beyond its yield point and is no longer serviceable.
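As a sketch of that logic (the .0045- to .006-inch window applies only to the production bolts described above; aftermarket bolts ship with their own published stretch targets):

```python
def bolt_stretch(free_length, tightened_length, target_min=0.0045, target_max=0.006):
    """Return measured stretch (inches) and a rough verdict per the text."""
    stretch = tightened_length - free_length
    if stretch < target_min:
        verdict = "under target: suspect thread friction; lubricate and recheck"
    elif stretch > target_max:
        verdict = "over target: bolt may be stretched past yield; replace it"
    else:
        verdict = "within target range"
    return stretch, verdict

s, v = bolt_stretch(2.0000, 2.0052)  # hypothetical free and tightened lengths
print(f"stretch {s:.4f} inch: {v}")
```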


Connecting rod bolt stretch gauges are preferred tools for many performance engine builders. This type of gauge allows you to precisely measure how far a rod bolt has stretched while under installed torque. Shown here are three manufacturers’ examples.


A rod bolt stretch gauge features a stationary anvil (seen here at the bottom of the tool frame) and a spring-loaded spindle connected to a dial indicator.

Although an outside micrometer may be used to measure rod bolt length, the most accurate method is to use a specialty fixture outfitted with a dial indicator. Excellent examples of this gauge include units from GearHead Tools, ARP, and Goodson Shop Supplies. GearHead’s bolt stretch gauge features a heat-treated aluminum frame (with a very handy thumbhole) and a specially modified dial indicator with sufficient spring tension to hold the gauge firmly to the ends of the rod bolt. The indicator can be rotated for right- or left-hand operation, and the lower anvil is adjustable to accommodate various bolt lengths. Goodson Shop Supplies also offers a rod bolt stretch gauge, P/N RBG-4, which features spherical points for consistent and repeatable readings and can also be rotated for right- or left-hand operation. ARP offers its own bolt stretch gauge as well, P/N 100-9941, designed with .0005-inch increments, a heavy spring, and ball tips.


Prior to installing a rod bolt, the bolt is placed in the tool with a bit of preload on the dial gauge, which is then zeroed. This provides a reference length for the bolt in its relaxed state. The anvil and spindle engage dimples in the rod bolt head and shank tip.

There is a debate among some engine builders regarding the validity of measuring rod bolt stretch, due to potential compression of the rod material as the rod cap is clamped to the rod. Although this can occur, the use of a stretch gauge remains the best practical method of accurately determining bolt load.

Connecting rod bolts can be viewed as high-tensile springs. The bolt must be stretched short of its yield point in order for accurate, and most important, repeatable, clamping of the rod cap to the rod. Improper or unequal bolt clamping force can easily result in a nonround rod bore.

Stock, or production, rod bolts typically offer a tensile strength of approximately 150,000 to 160,000 psi. However, due to variances in bolt production, tolerances can be quite extreme, with peak bolt stretch occurring anywhere from, say, .003 to .006 inch. If the installer uses only torque in the attempt to achieve bolt stretch, he runs the risk of unequal rod bolt clamping loads, due to the potential inconsistencies between bolts.

High-performance rod bolts are manufactured to much tighter tensile strength tolerances. ARP, for instance, calculates each and every rod bolt for stretch, and the bolt packages include reference data to that effect. The instructions actually recommend that a specific amount of bolt stretch should be achieved on each bolt (ARP cites 190,000 psi as its nominal or base tensile rating, with actual ratings much higher in some applications).

How can unequal/inadequate rod bolt tightness affect the connecting rod big end bore shape? Let me cite an example: If one technician reconditions the connecting rods using torque value alone to tighten the rod bolts, and another technician who is responsible for final assembly uses the bolt stretch method, the final result can be out-of-round bores. This is because of frictional variances. As a result, the assembler using the stretch method may achieve a higher clamping load on one or more bolts as compared to the loads imposed when the rod reconditioner torqued the nuts without regard to actual bolt stretch. When a bolt is tightened with dry threads, as much as 80 percent of the torque can be exerted because of friction, as opposed to bolt stretch.

In a high-volume production rebuilding facility, technicians may not have the time to measure for bolt stretch. However, a slower-paced operation that is attempting to obtain maximum accuracy (for a race engine, as an example) is far better off using the stretch method instead of relying only on the torque method.

A set of connecting rod bolts’ instructions may list both a torque value and a stretch range, effectively giving you a choice of methods. Yes, tightening only to a specified torque value is quicker, and measuring bolt stretch requires more time, but the best results are achieved by measuring bolt stretch. So, unless you’re in a rush, take the time to measure stretch, tightening each rod bolt to the recommended stretch range. It’s all about the quest for precision.

Connecting rod bolt tightening is an absolutely critical aspect of rod installation, to achieve the proper amount of rod bearing crush (contact force between the upper and lower bearing shells that serves to properly secure the bearings to the rod big end bore) and to obtain the correct level of rod bolt stretch and clamping force. Undertightened rod bolts don’t provide enough clamping force, and overtightened rod bolts can result in stretching the bolts beyond their elastic range. Either scenario can easily result in rod bolt failure, which in turn results in rod bearing failure and the very real potential for severe damage, including broken and/or twisted rods, damaged or broken crankshaft, broken camshaft, and rods busting through the block.

I cannot overstate the importance of connecting rod bolt tightening. Improper bolt installation results in a ticking time bomb, just waiting to destroy your engine and your wallet.

With regard to tightening rod bolts, there are three potential methods to consider: torque-plus-angle, torque alone, or tightening by monitoring bolt stretch. Torque followed by angle tightening applies only to OEM rod bolts that are specified for this type of installation. Any serious builder of a performance engine most likely opts for high-performance aftermarket rod bolts. Generally speaking, if the engine is intended to produce about 450 hp or more, I strongly advise using high-performance rod bolts instead of the OEM bolts. In this book, I focus on the use of these superior-quality bolts (as offered by such firms as ARP and others).

When using aftermarket performance rod bolts (or whenever you purchase a set of performance aftermarket connecting rods that include these bolts), bolt tightening instructions will be included, and it’s imperative that you follow the instructions.

Based on the rod bolt diameter, bolt grade, length, and application, the maker provides both a torque value and a target bolt stretch value. You can then decide which method to use. Some builders prefer to follow the torque spec; others prefer to measure bolt stretch. I always tighten by measuring stretch, because I feel that this provides a more accurate means of achieving the desired clamping load, as well as obtaining equal clamping loads on all of the rod bolts.

Torque-Plus-Angle

Many late-model OEM engines specify a torque/angle method for a variety of fasteners, including crank pulley bolts, rod bolts, main cap bolts, and cylinder head bolts. An initial torque value is achieved (obviously using a properly calibrated torque wrench). This establishes a specific initial level of clamping force. Final tightening takes place by continuing to rotate the bolt head by a specified number of degrees. This method, developed by OEM engineers, theoretically eliminates the variable of thread friction. Engineering research has determined that continuing to tighten the bolt by a certain number of degrees stretches the bolt into its desired range of elasticity for optimum clamping force.

Degree tightening can be accomplished by several different approaches, including placing a dot on the bolt head and observing how far the bolt head is rotated (for example, by 45 degrees, 90 degrees, etc.). This is a crude method, because it relies on your estimation of degree travel. An inexpensive tool that aids in angle tightening is a small degree wheel that is attached to a wrench. The wheel features degree increments and an adjustable needle to establish your zero mark and is observed as the bolt is rotated. The downside is that this requires using your torque wrench for initial tightening, setting the torque wrench aside, grabbing a wrench that’s equipped with the degree wheel, and continuing to tighten, while carefully observing the travel.

Another approach involves the use of a digital combination torque/angle wrench. This eliminates the need to switch tools in midstream. By pressing a button, you choose your torque format (ft-lbs or Nm) and your torque value (let’s say 35 ft-lbs for example). When the selected torque value is approached, the tool begins to beep and/or LED lights illuminate to let you know that you’re getting close. After the selected value is reached, the tool provides an audible alert, as well as the illumination of a final red light and a vibration in the tool handle (the types of alerts may vary among tool brands and models).

After the bolt is torqued, you simply press another button to switch to the angle mode, select the degree, and continue to tighten. The same alerts take place during the angle tightening phase. Depending on the tool model, you can even ratchet the tool during angle tightening without losing the angle reference (Snap-on’s TechAngle wrench is an example of this). A number of leading precision tool makers now offer these digital wrenches, including Snap-on, Mac, and others. Granted, they’re a bit pricey, but they work well and eliminate the need to use multiple tools. If you’re using performance aftermarket connecting rods and high-performance rod bolts, a torque-plus-angle approach is not required.

Ultrasonic Thickness Gauge

A sonic thickness gauge uses frequency bounce-back signals (similar to sonar) to measure material thickness. It is most commonly used to check an engine block’s cylinder walls prior to overboring, to determine existing wall thickness and to make sure that no areas of a cylinder wall are too thin after the cylinder is bored. Potential thin areas include those adjacent to cooling passages. The sonic gauge is first set up using a checking standard, which is a sample piece of metal similar to that of the engine block (since iron castings can vary depending on the makeup of the iron). Some gauge kits include sample standards of a marked thickness. The gauge is calibrated using the appropriate sample. The gauge probe is then lightly coated with a special grease that promotes a good signal. The probe is then placed in contact with the cylinder wall while the thickness readout is monitored. Cylinder walls should be checked from top to bottom at various clock positions, making note of the thinnest reading. Minimum acceptable cylinder wall thickness varies depending on the specific engine block, but generally speaking, a final minimum of about .200 inch should be acceptable. If a specific block is specified for a minimum of, say, .220 inch, and the cylinder wall’s thinnest area currently measures .223 inch, this tells you that a maximum of .003 inch may be removed during boring and honing. A cylinder wall that is considered too thin may be a candidate for sleeving.
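The overbore allowance in that example reduces to a subtraction; here is a minimal sketch (the .220-inch figure below is just the example minimum from the text, not a universal spec):

```python
def max_overbore(thinnest_wall, min_wall_spec=0.220):
    """Maximum material (inches) that may be removed from the cylinder wall
    before the thinnest area drops below the minimum thickness spec."""
    return max(thinnest_wall - min_wall_spec, 0.0)

print(f"{max_overbore(0.223):.3f} inch")  # 0.003 inch, matching the example
```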


After calibrating the sonic checker for the type of block material (some sonic checkers include sample standards of various iron compositions; if not, an accessible, easy-to-measure area of the block should be measured with a mic or caliper to calibrate the tool), the tool’s probe is placed against the cylinder wall, using a light coating of the supplied grease, which permits accurate measurement.

Cylinder walls should be measured for thickness at top, halfway down the bore, and bottom, at a variety of clock positions. Although engine block designs vary in terms of cylinder wall thickness, generally speaking, a minimum of about .200 inch should be acceptable.

Torque Wrenches

The use of a torque wrench allows us to apply a specific amount of rotational force (torque) to a bolt or nut. Torque wrenches are available in formats including ounce-inches, foot-pounds (ft-lbs), inch-pounds (in-lbs), and Newton-meters (Nm). Torque wrenches are available in various designs, including flex-bar style (sometimes called a scale type), dial indicator style, and the common micrometer style (often referred to as a “click” style), as well as digital styles.

With regard to the “click” type micrometer style, “release” models are also available that release upon reaching the preset torque (preventing overtightening), but may not provide an audible click. The release/click type wrench is adjusted by means of a micrometer scale on the handle.

If the torque wrench releases momentarily and/or clicks, this is referred to as a “signal” type. The “indicator” type refers to the visual display units, such as the flex bar or dial indicator style. Newer digital torque wrenches may provide an audible “beep” signal when the setting has been reached, and some may feature both a beep and a vibration when the adjusted level has been reached. The grip vibration feature is helpful in a noisy shop environment. Admittedly, this can be somewhat confusing, since there are so many different types available.


Example of a digital torque/angle wrench. The tool shown here is a Snap-on brand, but other torque wrench manufacturers are also now offering similar technology. Torque is monitored visually via a display window, and an audible beep sounds when the desired torque is achieved. Other models feature a series of green LED lights to alert you when you are approaching the target torque value, with a red light illuminating when the target is finally achieved. A torque/angle wrench allows you to both tighten by torque value and to rotate a fastener by monitoring the tightening angle. Using the Snap-on Tech Angle wrench as an example, press a mode button to enter the torque mode and press the value button until the desired torque is displayed.

Metric scale torque wrenches are available in Newton meters (Nm), meter kilograms (mKg), and centimeter kilograms (cmKg), with Nm being the more common scale. Many torque wrenches provide dual scales for reading in either English or metric formats (for example, a dual scale may offer both ft-lbs and Nm).

Torque wrenches are precision instruments and should be handled as such. Care and storage of a torque wrench is critical in terms of maintaining calibration. Any adjustable torque wrench (the commonly used micrometer-handled click type, for example) should be set at its lowest torque reading when not in use. This is something that many technicians commonly forget. If left stored at a high-torque setting, the calibration may be affected over a long term. When you’re done with the wrench, readjust it to the minimum setting before storing it in the toolbox.


By entering the angle mode, you set the desired angle. When the set angle is achieved, the wrench alerts you via a beep and a vibration in the grip. This is a handy feature for applications where you need to meet OEM torque-plus-angle tightening.

When using an adjustable torque wrench, be careful not to overtighten by applying torque past the release or signal point. With a ratcheting “click” type, the “click” may not be heard at low torque settings, especially in a noisy shop. It’s best to become familiar with the “feel” of the release, rather than relying only on the sound of a click.

When using an indicating type torque wrench (such as a flex bar or dial indicator type), read the indicator while viewing it at 90 degrees to its surface. Reading the indicator at an off-angle introduces parallax error.

Most torque wrenches operate accurately only when held by the center of their handle grips. Don’t use cheater bars to extend your grip farther away from the wrench head, and don’t grab the handle closer to the wrench head. Only grip the wrench by its designated grip area.

Torque Values and Fastener Clamping

The torque applied to a bolt or stud creates clamping force by stretch-loading, which can be loosely compared to the stretch of a rubber band. When the underside of the bolt head (or nut) makes contact with the parent surface, the additional rotation of the bolt head or nut causes the bolt shank or stud to begin to stretch. The objective is to reach the ideal point where this stretch provides the needed clamping force to properly secure the component being installed. When tightened properly (to specification), the fastener has stretched within its designed elastic range.

When the fastener is loosened, the elasticity allows the shank to return to its normal, uninstalled length. If stretched beyond its yield point, the fastener is permanently weakened. If the bolt or stud retains no elastic ability, it can’t do its job in terms of providing clamping force. If severely overtightened, a bolt can shear.

Bolt or stud diameters are based on the load required for component clamping performance. That’s why 1/4-inch bolts may be used in one location and 3/8-inch bolts in another. A smaller-diameter bolt requires less torque value to achieve ideal clamping load, and a larger-diameter bolt requires more torque value to achieve ideal clamping load. Although not a perfect analogy, you can sort of view threaded fasteners as “fuses.” The diameter is based on the requirement for the specific job, just as the amp rating of a fuse is based on the requirement for a particular circuit.

Taking advantage of a threaded fastener’s clamping load potential isn’t a matter of guesswork. Especially for critical fasteners, such as any involved in the brake system, steering system, suspension, engine, transmission, differential, and wheels, all threaded fasteners must be tightened to their specific-application torque value. If you don’t pay attention to torque values, it’s like buying a set of pistons and sticking them into cylinder bores without measuring oil clearance.

In addition to adjusting the setting and/or monitoring the preset level via a click, listening for a beep, or watching a dial, consider the variables. First of all, is the torque wrench accurate? If it’s a cheap one, or if it’s been lying around the shop for years, it may be out of calibration. Second, is the fastener being tightened lubricated properly? Third, is the clamping force being created suitable for the diameter and type of metal?

About 90 percent of the torque applied during tightening is used to overcome friction. Friction occurs between mating threads, as well as between the underside of the bolt head (or nut) and the parent material of the object being installed.

Excess friction can occur if galling or “thread seizing” takes place. This is especially common with threaded fasteners made of alloys such as aluminum, stainless steel, and titanium. If galling occurs (at any level of severity), this makes your torque readings inaccurate, since the galling effect will add significant friction at the thread mating area, which results in a severely undertightened fastener.

Several factors can affect fastener tension, including type of material, material hardness, lubrication (or lack thereof), fastener hardness, surface finish/plating, thread fit, and tightening speed.

• Make sure the threads (both male and female) are clean.

• Make sure the threads are in good condition, and free of deformation or burrs.

• When necessary, apply the required lubricant to the threads before assembly (this may involve engine oil, molybdenum disulphide, an anti-seize compound, or an anaerobic thread-locking compound, depending on the situation).

• Keep your torque wrenches clean and calibrated. Depending on their amount of use, consider sending your torque wrenches out for recalibration once per year. Also, store your torque wrenches in a safe place. Don’t toss them around the shop. They’re delicate instruments.

• When tightening, whether using a common hand wrench or a torque wrench, slow down! The action of tightening quickly can increase friction and heat at the thread area, which can lead to thread galling. Speedy tightening can also lead to inaccurate tightening, as the torque wrench must overcome the increased friction.

• When you reach the torque limit (your desired torque value), approach this slowly and watch the needle or feel for the click or vibration (depending on the style of torque wrench). If you tighten too fast, you may pull the wrench past the preset limit (unknowingly adding a few more foot-pounds of torque).

The use of a torque wrench or other torque value or fastener stretch monitoring device is absolutely necessary for anything engine related (not only cylinder heads and connecting rods, but intake manifolds, carburetors, water pumps, oil pumps, rear seal housings, oil pans, timing covers, valvecovers, exhaust manifolds or headers, power steering pump brackets, etc.). If you own a torque wrench, take the time to look up the torque value for the carburetor fasteners, and adhere to the specs. Tightening anything on the engine by “feel” is more often than not the root cause for annoying fluid or vacuum leaks. The skilled engine builder/assembler never guesses about anything. Undertightening or overtightening can and does lead to problems, ranging from the mildly annoying to the most severe. Don’t guess!

Adapters and Extensions

As long as the adapter (socket extension, etc.) is in-line with the torque wrench drive, no compensation is required. However, if an adapter that effectively lengthens the wrench is used (such as a crow’s-foot wrench), a calculation must be made to achieve the desired torque value.

For those occasions when a straight socket can’t be used, a special attachment might be needed (such as a crow’s foot). The use of an offset adapter changes the calibration of the torque wrench, which makes it necessary to calculate the correct torque settings. Following are two formulas for calculating this change:

TW (where the adapter makes the wrench longer) = [L ÷ (L + E)] × desired TE

TW (where the adapter makes the wrench shorter) = [L ÷ (L - E)] × desired TE

Where:

E = Effective length of extension, measured along the centerline of the torque wrench

L = Lever length of the wrench (from the center of the grip to the center of the wrench drive)

TW = Torque setting on the torque wrench

TE = Torque applied by the extension to the fastener

If you want to know where to set the torque wrench when using an adapter that alters the effective length of the wrench, you must calculate to compensate for the adapter. If the distance from the wrench drive to the center of the bolt makes the wrench longer, the final wrench setting must be adjusted to a lower value to compensate. If the distance from the wrench drive to the bolt center makes the wrench shorter, the wrench must be set to a higher value to compensate.

Let’s say that you want to torque a bolt to 40 ft-lbs, but you’re using a 2-inch-long wrench extension. For the sake of example, the length of the torque wrench is 14 inches (from center of the handle to center of the drive). Adding the 2-inch wrench extension makes the total length (center of grip to bolt-engaging wrench) 16 inches. In this case, you divide the length of the torque wrench (L, from the center of the handle to the center of the drive) by L+E, then multiply that ratio by the desired value.

In this example, the formula works out to: 14 ÷ 16 × 40 = .875 × 40 = 35.

In this example, where you want to tighten at 40 ft-lbs, using a 2-inch extension, you set the torque wrench at 35 ft-lbs.
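If you’d rather not do the arithmetic by hand, the compensation formula is easy to script. The following Python sketch is just an illustration (the function name and the signed-extension convention are inventions for this example, not from any tool maker):

# Sketch of the extension-compensation formula: TW = [L / (L + E)] * TE.
# Pass extension_in as positive when the adapter lengthens the wrench,
# negative when it points back toward the handle (shortening the reach).

def wrench_setting_ftlbs(desired_ftlbs: float,
                         wrench_len_in: float,
                         extension_in: float) -> float:
    """Return the torque wrench setting that yields the desired torque."""
    return wrench_len_in / (wrench_len_in + extension_in) * desired_ftlbs

print(round(wrench_setting_ftlbs(40, 14, 2), 1))   # 35.0: set the wrench at 35 ft-lbs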

It’s important to orient the wrench extension in-line with the torque wrench itself, to achieve a straight shot from the torque wrench body to the extension. If you angle the extension off-parallel to the torque wrench, the applied torque will not match your calculated setting.

For hard-to-reach areas that prevent you from using only a socket on the torque wrench, an offset wrench extension may be used. When an extension effectively increases the length of the torque wrench, you must adjust the wrench setting to compensate; otherwise you overtighten beyond the desired torque value. The length of the extension must be factored in to properly adjust your torque. A wrench extension is marked for its length.

When using a wrench extension on your torque wrench, always keep the extension parallel to the torque wrench body. If it is not parallel/straight, the applied torque will be inaccurate.

If, due to required access to the bolt, the wrench extension needs to be rotated 180 degrees (still in line with the torque wrench, but effectively making the total wrench reach shorter), and you still want to achieve 40 ft-lbs of torque, you need to compensate for this shorter distance by modifying the formula as TW = [L ÷ (L - E)] × desired TE.

If the torque wrench length (center of grip to center of head) is 14 inches, and the wrench extension is 2 inches long, with the extension effectively reducing the grip-to-bolt engagement distance, and you still want to achieve 40 ft-lbs of torque, the formula works out to:

14 ÷ (14 - 2) × 40 = 1.167 × 40 = 46.7

Because the leverage of the setup has decreased, to achieve 40 ft-lbs you set the torque wrench at approximately 46.7 ft-lbs.
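The same sketch from the earlier example handles this reversed case; passing the 2-inch extension as a negative value gives the higher setting:

print(round(wrench_setting_ftlbs(40, 14, -2), 1))  # 46.7: set the wrench at about 46.7 ft-lbs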

• If the adapter makes the wrench longer, you must back off on the torque wrench setting.

• If the adapter makes the wrench shorter, you must increase the adjustment on the torque wrench.

Special Torque/Angle Torque Wrenches

Thanks to advancements in technology, torque wrenches are now available that allow you to achieve both torque value and applied rotation angle without the need for a separate angle gauge.

One example is Snap-on’s Techangle series of wrenches. Featuring sensor electronics with digital control and readout, these wrenches allow you to preset (program) the torque value you want, or both the torque value and the final applied angle, depending on your needs. The electronic control also allows you to choose among in-lb, ft-lb, and Nm readings, plus angle.

An internal gyroscope provides the angle sensing. Here’s how it works: you program the desired torque value and tighten the fastener. When you reach the programmed torque value, the wrench beeps and vibrates. Then, if you need additional angle rotation, you program the desired angle. When you continue to apply pressure and reach the programmed angle, the wrench again beeps and vibrates.

The preset angle range is 5 to 360 degrees, with a resolution of 1 degree and accuracy of +/- 1 degree. Unlike the use of a separate angle gauge, where you can’t ratchet (with an angle gauge you must start and continue the angle rotation in a steady, one-direction sweep), this tool allows you to ratchet without “losing” the angle memory. Pretty cool.

Two models are currently available, including ATECH2FR100, with a torque range of 5 to 100 ft-lbs; and ATECH3FR250, with a torque range of 12.5 to 250 ft-lbs.

Cam Lobe Checker

Prior to the installation of lifters and pushrods, a camshaft lobe gauge can be used to accurately determine the position of an individual camshaft lobe (for instance, when positioning a cam with a specific intake or exhaust lobe at its base circle, locating a lobe at its peak, or measuring a lobe from base circle to peak to verify the lobes against published lift specs). The tool features an aluminum tubular body with an adjustable plastic section that expands to lock the body into the lifter bore (adjustable for various lifter bore diameters). At the lower tip is a spring-loaded, rounded plastic plunger that contacts the cam lobe. At the top is a gauge.

Rotate the camshaft until the lowest reading is obtained (with the cam lobe on its base circle). Zero the gauge needle. As you rotate the camshaft, the gauge shows the lobe ramp and maxes out at peak lift. For instance, if the camshaft lobe lift (not valve lift) is specified at .500 inch, the gauge shows the actual amount of lift when the lobe reaches maximum. You then compare this reading to the published lobe lift. This is a very easy tool to use, with several applications whenever you need to monitor lobe position or base-to-peak travel.


The indicator is zeroed after the camshaft is rotated so that the gauge plunger contacts the camshaft base circle. This gauge allows you to not only locate the base circle, but to measure the cam lobe ramp and peak as the camshaft is rotated. A camshaft lobe checker gauge features an aluminum body, a dial indicator, and a spring-loaded plunger. The plunger is a plastic composite material, placed directly onto a camshaft lobe. The gauge body features an expandable collar that allows you to rotate the body to lock the tool into the lifter bore.
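As a simple illustration of the base-circle-to-peak check described above, here is a short Python sketch. The .002-inch acceptance tolerance is an assumed figure for the example, not a published spec:

# Hypothetical lobe-lift check: the indicator is zeroed on the base
# circle, so the peak gauge reading IS the lobe lift (base to peak).

def lobe_lift_ok(peak_reading_in: float,
                 spec_lift_in: float,
                 tolerance_in: float = 0.002) -> bool:
    """True if the measured lobe lift falls within tolerance of the spec."""
    return abs(peak_reading_in - spec_lift_in) <= tolerance_in

# Cam card lists .500-inch lobe lift; the gauge maxed out at .498 inch:
print(lobe_lift_ok(0.498, 0.500))   # True: within .002 inch of spec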

Conversions

Metric

1 Newton Meter = .7376 ft-lb

1 Newton Meter = 8.8507 in-lb

1 mKg = 7.233 ft-lb

1 cmKg = .868 in-lb

1 mKg = 9.807 Newton Meter

1 Meter = 100 centimeters

1 Meter = 39.37 inches

1 Meter = 3.2808 feet

1 Kilogram = 1,000 grams

1 Kilogram = 2.2046 pounds

1 Newton = .2248 pounds

English

1 in-lb = 1.152 cmKg

1 ft-lb = 1.356 Newton Meter

1 in-oz = 28.35 in-grams

1 in-lb = 16 in-oz

1 ft-lb = 12 in-lb

1 foot = 12 inches

1 pound = 16 ounces

1 pound = 453.59 grams
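For quick reference at the bench, the torque conversions above can be collected into a few constants. A minimal Python sketch:

# Torque conversion constants from the tables above.
NM_PER_FTLB = 1.356       # 1 ft-lb = 1.356 Newton Meters
FTLB_PER_NM = 0.7376      # 1 Newton Meter = .7376 ft-lb
INLB_PER_FTLB = 12        # 1 ft-lb = 12 in-lb

def ftlbs_to_nm(ftlbs: float) -> float:
    """Convert foot-pounds to Newton Meters."""
    return ftlbs * NM_PER_FTLB

print(round(ftlbs_to_nm(40), 1))   # 40 ft-lbs is about 54.2 Nm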
