Federación de Cooperativas de Cuerno Verde

A Post-Capitalist, Fossil Free, Co-Created Local Federation of Worker Owner Cooperatives

(glossary at the bottom of post)

The ecological footprint1 of all humanity now requires the biocapacity of almost two Earths. If everyone on the planet lived like we do in the US, we would chew through our natural resources at a rate closer to five Earths2. Ecologists call this impossible imbalance between people and nature Ecological Overshoot3.

It is important to understand that Climate Change is just one symptom of this overshoot. Broadly speaking, overshoot is about too many people, growing exponentially in number while consuming too many of our planet’s natural resources and leaving behind too much toxic waste and destroyed habitat, without showing the love, care or humility owed toward all living things – including our own species – on Earth. A growing chorus of experts and scientists says we are rapidly headed toward an unthinkable destination if we keep to this path. Yet, on we march.

There is no basis for expecting that our political elites will lead us through4 or that some imagined invisible hand of Capitalism5 can lift us out of overshoot. They wouldn’t if they could. The Oligarchs of capital swill from the stills of profit: increasingly out of touch, befuddled and incoherent. The political classes are just as drunk on potent brews of corruption. We, their supporters, employees and voters, are their easily played, easily disposed of codependents when, in reality, our obedience and obeisance to their ideas and ideals is plainly delusional and certainly terminal.

The Green New Deal6 is little more than a greenwashed7 genocidal election-season profit scam that perpetuates their Business as Usual8. It does less than nothing to resolve the real existential dilemmas of overshoot9. It is absurd to believe we can economically grow our way out of overshoot by producing unimaginable volumes of carbon-intensive solar panels and wind turbines. I mean, Don’t Look Up10, but unless we rapidly and massively reduce our collective ecological footprint to far less than one Earth without delay, the future is 95% ± 5% doubtful.

The best actions we can take – for the youngest alive today and for their children’s children – are to immediately cease all uses of fossil carbon, desist at once from all pursuit of economic growth, abolish wealth, begin decreasing our population and, with all our care and love, properly inform every human walking this planet about its true carrying capacity and together do all that must now be done – and undone. And, for the sake of the next seven generations, we have to do all this without degaussing our moral compass. Either we begin to cooperatively and fastidiously toil manually for the changes needed, or else we plunge into hell on Earth11.

People have basic physical needs12 that include food, water, and some combination of clothing and shelter depending on the location and seasonal conditions. Through most of human history these needs were met by what was available nearby. As human populations grew and groups moved into new territory, that arrangement began to change slowly. Prior to the Industrial Revolution the food products that could be easily transported and that would not spoil during the journey were relatively few and, for the most part, too expensive for commoners anyway. As ships became coal fired and steam powered, that changed in a hurry.

Over the last two hundred years the fiat costs of our needs, relative to personal resources and income, have dropped substantially, while the assortment of luxuries we have come to consider necessary has gone through the roof. In part this conspicuous transformation13 was achieved through the abuse of nature, through methods of human exploitation refined by the Industrial Revolution, and through advances in the exploitation of fossil energy. Clever middle-men found higher profits by reaching farther and farther into the abyss to build human chains of exploitation on the back of cheap fossil energy. So-called supply chains flourished that enabled merchants with capital to profit from producers who were, in turn, exploiting other humans – slaves, slave traders, farmers, manufacturers, etc. – or exploiting fossil energy – as used by manufacturers, shippers and distribution warehouses – without being directly involved and so not culpable in the primary exploits.

More recently, ever in search of greater profits, the merchants began to acquire and consolidate resources in order to further optimize, exclusively to the merchants’ benefit, the profits wrought from the primary exploiters. By now, as we know, family farmers have all but vanished, manufacturers have relocated afar for lower labor costs, and only a handful of shippers remain in operation, yet they carry more goods than ever as product storage warehouses have learned to operate on just-in-time principles and the consumption frenzy is a-whirl.

At the same time, fewer and fewer people every day are willing to be so unfairly exploited; the finite supply of fossil energy is nearing its end, and jobs pay less and offer less rewarding work every day, relatively speaking. Meanwhile, the merchants of capital continue to enjoy greater profits than at any time in the past and consumption grows more conspicuous every day. Something has got to give. And it just may be life itself.

It is needlessly so, because most communities have the potential to satisfy these basic needs from within. Developing that potential for self-sustainability into a proven and ongoing reality now will be critical to a transition toward survival when the unavoidable moment arrives and the fossil-greased supply chains of plenty screech to a halt. Establishing community-scoped, fully decarbonized local supply chains for food, shelter and clothing now is crucial not only for future generations but also to end the homelessness, hyper-partisanship, hate and hunger already spawned by the merchants of capital – cum Machiavellian or otherwise anti-social elites – now in full and unabashed control of the social and economic systems. Preparedness to sustain your community is vital for a just transition to whatever comes next.

Failing to develop local self-reliance now leads only to our demise. We cannot rely upon external supply chains even as they continue to collapse, as they unavoidably will in the coming decades. We needn’t rely on them if we build the most essential supply chains locally, completely within our community, with the wisdom and insight14 of our community elders mixed with the sweat of our able-bodied neighbors. A well-considered federation of Worker Owner Cooperatives can replace not-quite-enough with abundance in lieu of the profits extracted at each link in today’s long and well-oiled supply chains15. Even if we lose some variety of choice – oranges could become scarce in temperate regions and broccoli may be hard to come by in the Tropics, for example, so healthy menu planning will be as important as always – working families will be better off16. If everyone has adequate food and clothing, is housed regardless of income and pitches in to the degree they are able, how could it be any better? This is far more of a good life than we have had as a species at any time up to now, because we can co-create it.

A growing number of US states, including Colorado in 2011 with SB11-191, have adopted the Uniform Limited Cooperative Association Act17. The Act supports the formation of Cooperatives that can lawfully function in the current economic system on the sustainable foundation of the triple bottom line – people, planet and profit18 – and that can begin at once to plan for a just transition to the post-capitalism economic system we can expect to emerge once the oil spigot is shut – as it must be very soon if there is to be a future. Organizing under the Act allows us to democratically define our rules and operate in accordance with the purpose and values of the International Cooperative Alliance19 and the Principles it recognizes, somewhat modernized from the cooperative principles first put forth by the Rochdale Society of Equitable Pioneers – weavers and other tradespeople in England – in 1844:

Definition of a Cooperative

A cooperative is an autonomous association of persons united voluntarily to meet their common economic, social and cultural needs and aspirations through a jointly-owned and democratically-controlled enterprise.

Cooperative values

Cooperatives are based on the values of self-help, self-responsibility, democracy, equality, equity, and solidarity. In the tradition of their founders, cooperative members believe in the ethical values of honesty, openness, social responsibility and caring for others.

Cooperative Principles

The cooperative principles are guidelines by which cooperatives put their values into practice.

1. Voluntary and Open Membership

Cooperatives are voluntary organisations, open to all persons able to use their services and willing to accept the responsibilities of membership, without gender, social, racial, political or religious discrimination.

2. Democratic Member Control

Cooperatives are democratic organisations controlled by their members, who actively participate in setting their policies and making decisions. Men and women serving as elected representatives are accountable to the membership. In primary cooperatives members have equal voting rights (one member, one vote) and cooperatives at other levels are also organised in a democratic manner.

3. Member Economic Participation

Members contribute equitably to, and democratically control, the capital of their cooperative. At least part of that capital is usually the common property of the cooperative. Members usually receive limited compensation, if any, on capital subscribed as a condition of membership. Members allocate surpluses for any or all of the following purposes: developing their cooperative, possibly by setting up reserves, part of which at least would be indivisible; benefiting members in proportion to their transactions with the cooperative; and supporting other activities approved by the membership.

4. Autonomy and Independence

Cooperatives are autonomous, self-help organisations controlled by their members. If they enter into agreements with other organisations, including governments, or raise capital from external sources, they do so on terms that ensure democratic control by their members and maintain their cooperative autonomy.

5. Education, Training, and Information

Cooperatives provide education and training for their members, elected representatives, managers, and employees so they can contribute effectively to the development of their co-operatives. They inform the general public – particularly young people and opinion leaders – about the nature and benefits of co-operation.

6. Cooperation among Cooperatives

Cooperatives serve their members most effectively and strengthen the cooperative movement by working together through local, national, regional and international structures.

7. Concern for Community

Cooperatives work for the sustainable development of their communities through policies approved by their members.

Hearkening back to the peasant lifestyle of the Middle Ages in Europe – and as easily to most any indigenous culture elsewhere in the world, then and today – we require local supply chains for our basic needs with households at their core. In support of household gardens we need to be able to assure everyone year-round access to adequate food – and clean drinking water – within walking distance of every home in each neighborhood. We must produce and preserve as much of that food locally, in each neighborhood, as is humanly possible if we are to eliminate dependence upon fossils for the production and transportation of food supplies.

Sooner or later we will also require a local – neighborhood, let’s call it – capability to produce household woven textiles, clothing, basketry, clay pottery and metal utensils as well as draft-animal and pedal-powered local transportation systems.

In addition, and absolutely critical to our success, we must assure that all are housed. A person without a reliable shelter for cooking and sleep is a person without enough. Our goals of equity must include the abundance of enough, or else we are divided and, thus, we will fail. When considered wisely it can be seen that the most effective, efficient, and affordable way to do this is by building single-story adobe and other earthen housing using the local soils, with naturally efficient biomass-fueled masonry heat sources, natural water conservation systems, natural ventilation and cooling systems and natural regeneration systems for household waste.

The enterprises that can meet these needs will have to be made up entirely of free-willed local workers using primarily locally sourced materials, and may ultimately be rewarded with the abundance of enough to eat even should there be no work paid in fiat wages to be had. Most likely we will have to get things going with payment in coin. We will be best prepared if we anticipate that the secondary systems of exchange – whatever the fiat currency – will collapse when fossil use is ended. Though, perhaps we might still be able to barter with neighboring communities on an ongoing basis? Current dependencies on fossil-powered industrial processes will change significantly and may not be possible. The skills, wisdom and insight of all will need to be smart and keen if we are to live as a community and with prosperity. While, ultimately, the workers of each cooperative – and in turn the federations of cooperatives – will determine democratically what they will do and how they will do it, the variety of cooperative enterprises in which people might choose to be involved could include, but is not limited to, the following supply chain scales.

Local food scale:

  • Help neighbors to grow no-dig gardens that sustain households, build soil & attract pollinators
  • Farm community food forests that produce fruits, grains, bast fibers, vegetables, eggs, etc.
  • Forage edible & medicinal plants, and hunt, if we must, always without lasting harm to nature
  • Grow foods in neighborhood all-season greenhouses & seasonal neighborhood market gardens
  • Build soil regenerating compost infrastructures that warm greenhouses, intake household & garden wastes then distribute the finished compost to all gardens within the neighborhood
  • Operate neighborhood root cellars under shared winter ice assisted solar cold storage lockers
  • Pack, can, dry, ferment and grind foods for later distribution at neighborhood kitchens

Local textile, fiber and household items scale:

  • Produce hemp, flax, cotton and wool in permaculture and neighborhood gardens
  • Forage strong plant fibers (e.g., yucca, milkweed, dogbane, etc.) without harm to nature
  • Hunt when it can be done without harm to nature
  • Hand spin and hand weave textiles, baskets, rope & shoes from fibers we grow, gather & forage
  • Make pottery & containers from clay and glass using local soils/sands and recycled glass
  • Design, tailor & hand or treadle stitch clothing & other goods from the fibers we spin & weave

Local technical services and support scale:

  • Bio-mass Blacksmiths, bio-char makers, metal workers, wainwrights, toolmakers, etc.
  • Construction
    • Raise earthen passive solar homes, insulated with limed hurd from hemp we grow
    • Erect Neighborhood and household passively heated greenhouses
    • Re-purpose & recycle locally available man-made building materials
  • Household Mechanical
    • Trombe Wall passive solar heating systems
    • Bio-mass masonry mass stove & rocket heater builders
    • Passive solar cook tops and ovens
    • Passive evaporative cooling20 and ventilation
    • Compost toilets manually integrated with household composting systems
    • Winter ice cold storage & root cellars at Neighborhood facilities
    • Compost & passive solar mass neighborhood greenhouse warming systems
    • Household 3 & 4 season Walipini sunken greenhouses
  • Plumbing
    • household and neighborhood facility rainwater catchment
    • river basin gravity water supply ditches
    • Josh Kearns style bio-char potable water purifiers21 at neighborhood facilities
    • rooftop, ground mount & portable passive solar water warming vessels
    • flush toilets drained to household cat-tail lagoons staged into garden water systems
    • hydrophilic plant based gray to garden water purification systems

This enumeration of Cooperative Enterprises is neither fixed, mandatory nor, necessarily, exhaustive. It should, however, be adequate to get the conversation started about what we will need. Future blog posts will explore these local supply chains. I welcome all feedback offering improvements, corrections or volunteers for participation in a foundational (vis-a-vis unpaid volunteer) steering committee for a federation in your community. I am interested in participating in such a steering committee in Pueblo, Colorado, for example, but am clueless about any others that might wish to participate here. If you want to run with this idea in any other location, I urge you to go right ahead with urgency, even if you don’t let me know about what you have going on. I would suggest that it could be helpful to let me know, because I could then help others who stop by here from your location to find you. I am old and likely cannot contribute in any other way, to be honest. I just want to see things get off to a good start. From there it will be in your competent hands.

Footnoted References

1 Wackernagel, Mathis & William Rees (1996) Our Ecological Footprint: Reducing Human Impact on the Earth, New Society Publishers

2 U.S. Environmental Footprint Factsheet, 2021, Center for Sustainable Systems, University of Michigan, published online

3 Catton, William (1980) Overshoot: The Ecological Basis of Revolutionary Change, Illini Books

4 Greve, Joan E. (Feb 22, 2022) Money unites: Republicans and Democrats find rare bipartisanship over trading stocks, The Guardian, published online

5 Smith, Adam (1790) The Theory of Moral Sentiments, public domain, reprinted online

6 H.Res. 109 (2019) Recognizing the duty of the Federal Government to create a Green New Deal, United States Congress

7 Harsanyi, David (Feb 27, 2019) The 10 Most Insane Requirements Of The Green New Deal, The Federalist, published online

8 Macy, Joanna and Molly Brown (2014) Coming Back to Life, New Society Publishers, Chapter 1

9 Jensen, Derrick, Lierre Keith and Max Wilbert (2021) Bright Green Lies, Monkfish Book Publishing

10 McKay, Adam, Director, story by David Sirota (2021) Don’t Look Up, Netflix

11 Wallace-Wells, David (2019) The Uninhabitable Earth: Life After Warming, Tim Duggan Books, Crown Publishing

12 Maslow, Abraham (1943) A Theory of Human Motivation, Psychological Review, v. 50 n. 4, pp. 370-396

13 Veblen, Thorstein (1899) The Theory of the Leisure Class: An Economic Study in the Evolution of Institutions, The Macmillan Company, public domain

14 Macy, Joanna op. cit., Chapter 4

15 Scanlan, Melissa (2021) Prosperity in the Fossil-Free Economy, Yale University Press

16 Jones, Patrick and Meg Ulman (2020) Replacing growth with belonging economies, Artist as Family Channel, YouTube.com, posted online

17 Uniform Limited Cooperative Association Act (2007) National Conference of Commissioners on Uniform State Laws

18 Elkington, John (1999) Cannibals With Forks: The Triple Bottom Line of 21st Century Business, John Wiley & Sons Ltd

19 Guidance Notes to the Co-operative Principles (2015) International Cooperative Alliance, published online

20 Ford, Brian (2001) Passive Downdraught Evaporative Cooling Principles and Practice, Environmental Design, Cambridge University, V. 5 n.3

21 Kearns, Josh (uncertain) A Field Guide to Biochar Water Treatment, Substack.com, active in 2022, published online

Glossary

Adobe – A soil-based and somewhat carbon-sequestering structural building material ideally consisting of 4 parts water, 7 parts clay-rich subsoil and 7 parts washed sand. Adobe may be used in building construction in bagged (see superadobe and hyperadobe), brick, block, slab and rammed earth mass forms. Straw or other high-tensile plant fibers chopped to lengths of roughly 3”-6” (8-15 cm) or less are best suited to adobe block formulations. Bagged adobe is most effective when free of organic material, while adobe block and slurry are strengthened when well combined with 4 parts chopped/mulched straw before molding or pouring into place. Lime might also be added as a binder. Less sustainable adobe mixtures might alternately include a concrete binder. All types of adobe walls are usually finished with a layer of cob or lime plaster on all exposed surfaces.

Alchemy – A philosophy of nature and protoscientific methodologies that seek to holistically purify, mature, and perfect materials or practices. The concept probably originated in Egypt, where the 22 Symbols of Alchemy have been observed in hieroglyphs, although it has been practiced throughout Europe, Asia and Africa in a variety of incarnations for a few thousand years.

Climate Change – Human activities are causing ongoing alterations in the meteorological conditions, including temperature, precipitation, and wind, that characteristically prevail in every region of the Earth. This is driven most significantly by the combustion and distillation of extremely dense fossil carbon deposits extracted from the Earth’s crust and released into the biosphere. Atmospheric carbon emissions insulate the Earth’s surface, restricting heat from escaping into space. About 93% of this excess trapped heat has thus far been absorbed by the oceans, which are at the same time being acidified as they absorb much of that same fossil carbon. A consequent non-linear dilution of oceanic ecosystems by rapidly increasing volumes of non-saline glacial meltwater, owed to increased surface temperatures, is set in motion. This onslaught slowly interrupts the climate-driven oceanic processes of current and overturning oscillation prevalent since at least the onset of the Pleistocene Epoch, roughly 2.6 million years ago. Much of the remaining heat is trapped in the troposphere (the lowest level of the atmosphere), raising ambient global air temperatures. Ocean temperatures have continued to increase, as have average ambient air temperatures and atmospheric carbon concentrations. The air temperature increase now averages in excess of one degree Celsius since the beginning of the Industrial Revolution in the 18th Century AD, less than 300 years ago. The United Nations’ Intergovernmental Panel on Climate Change (IPCC) projects that average air temperatures are likely to reach 2.4º C above that baseline by 2050 unless humans reduce all transfers of carbon into the atmosphere by 10% per year, reaching zero transfer by 2030, and employ currently unproven technologies to re-sequester atmospheric carbon back into the Earth’s crust. It is worth noting that climate change to date has consistently happened at a faster rate and to a higher degree than previous IPCC calculations projected. Without exception, the IPCC has revised the anticipated rate of thermal increase and projected peak values upward at each Conference of the Parties (COP) since the first in 1995.

Cob – A natural carbon-sequestering building material made from subsoil, water, and coarsely mulched straw or other high-tensile plant fiber. The base formula for a cob mixture is 2 parts sand and 1 part clay-rich soil, but varies with the clay content of the soil used. The clay soil should be slaked (soaked) overnight and the excess water poured off before mixing with sand and then adding as much fiber as possible without making the cob too difficult to apply. Cob fibers are mulched to longer lengths than fibers used in adobe mixtures. The Uniform Building Code (UBC), as currently applicable in Colorado, also refers to cob as “unburned clay masonry”. According to the wikipedia.org entry for cob, cob is a synonym for adobe. That may not be quite correct, but they are very similar in composition. See daub for better clarity on the difference.

Co-Creative – A metaphysical notion about manifesting desires, meeting goals, and planning out the path into the future in community. In our context – perhaps in any context – co-creation implies a process that embodies participatory democracy just as participatory democracy necessarily invokes a co-creative commitment.

Codependent – Of or relating to a relationship in which one person is psychologically dependent in an unhealthy way on someone who is addicted to a substance or self-destructive behavior, such as a gambler with her heroin-addicted bookie, a politician with a Capitalist or a battered spouse with their abusive mate.

Cuerno Verde – Cuerno Verde is a Spanish name applied to the highest peak of the Wet Mountains that rise between the buffalo grass prairies stretching east along the Arkansas River from Pueblo, Colorado, traditional Muache-Ute and Uncompahgre-Ute hunting grounds, and the headwaters of the Rio Grande in the San Luis Valley surrounding Alamosa, Colorado, the historical homelands of the Uncompahgre-Ute. In the San Luis Valley industrial hemp has been farmed since its recent re-legalization in 2012. Cuerno Verde translates literally as “Green Horn” in English, usually taken as a somewhat derogatory term for a person with lofty aspirations and stubborn determination yet little practical experience or training with whatever must be done to realize those aspirations.

Daub – The binding and/or filler material of various wattle and daub construction techniques in use since at least the Neolithic period (10,000–4,500 BCE) of the human Stone Age. The wattle is generally the underlying structural component in these techniques, such as interwoven sticks, adobe blocks, hardware fabric or straw bales. The daub is most often a component applied while wet onto the wattle to form weather-resistant protection between the main structural beams, blocks, posts, poles, boulders, etc. of a building or shelter constructed from locally available components. Common daubs include cob, concrete mortar and lime plaster. A daub might also serve as the structural component, the wattle, in some wattle and daub structures such as adobe block, hyperadobe, superadobe, earth bag or igloo construction.

Decortication – The technique for separation of the bast fibers from the woody pith of properly retted, or otherwise cured and/or dried stems and stalks of certain fibrous plants such as hemp, or flax.

Ecological – Of or pertaining to ecology – the study of the interrelationships of any and all forms of life and their environment. Often also used to describe something that is not harmful to the environment. Frequently weaponized by market capitalists for its green-washing effect.

Ecological Footprint – Coined by Mathis Wackernagel and William Rees in their book Our Ecological Footprint: Reducing Human Impact on the Earth (New Society Publishers, 1996). “measures how much nature we have and how much nature we use” – (from the Wackernagel-founded Global Footprint Network’s website https://www.footprintnetwork.org). An Ecological Footprint is generally expressed as the number of Earths that would be needed to sustainably support an individual’s or a particular group of humanity’s (e.g., faculty, staff and students of a post-secondary institution, a geographic segment, all people on the planet, etc.) impact on nature and natural resources.

Ecological Overshoot – see Overshoot below

Egalitarian – Affirming, promoting, or characterized by belief in equal political, economic, social, and civil rights for all people. Egalitarian is useful because it implies the equities of feminism, racial justice, LGBTQ justice and climate justice without the tokenism that might creep in if the focus were shined upon one particular scope of inequity of Industrial Society. Gardeners who feel a particular equality is most important or appropriate to their life are free to use whatever term[s] they feel more appropriate than the uber-inclusiveness implied by egalitarian.

Fossil Free – Without the use of fossil fuels or energy-intensive processes such as mining and industrial manufacturing.

Hemp – Industrial Hemp as defined under Title 35 Article 61 of the Colorado Revised Statutes and verified by testing at the Colorado Department of Agriculture: “Industrial hemp means the plant species Cannabis sativa L. and any part of the plant, whether growing or not, containing a delta-9 tetrahydrocannabinol (THC) concentration of no more than three-tenths of one percent (0.3%) on a dry weight basis.” – https://ag.colorado.gov/plants/industrial-hemp (11/26/2021)

Hempcrete – a Lime, Hemp Hurd and water mixture – perhaps combined with other natural materials – most useful as a carbon sequestering non-structural and quite flame resistant insulating material. Concrete is not used in hempcrete.

Hemp Hurd – The inner or woody fiber core of hemp stems and stalks other than the outer high strength fiber layers better used for cordage.

Hyperadobe – A raschel mesh bag encased, soil-layered adobe building style developed by Fernando Pacheco of EcoOca in Brazil beginning about 2006. Hyperadobe is ideally about 70% sand and 30% clay, but this can vary. The raschel mesh bag allows the soil of the layers to bond without the addition of barbed wire between layers as is required with superadobe, so it is a faster and more economical earth bag method.

Food Forest – A food forest, also called a forest garden, is a diverse planting of edible plants that attempts to mimic the ecosystems and patterns found in nature: a style of permaculture. Food forests are three dimensional permaculture designs, with life extending in all directions – up, down, and out.

Generally, we recognize seven layers of a forest garden – the over-story, the under-story, the shrub layer, the herbaceous layer, the root layer, the ground cover layer, and the vine layer. Some also like to recognize the mycelial layer, layer eight (mushrooms). Using these layers, we can fit more plants in an area without causing failure due to competition. (see projectfoodforest.org)

Kachelofen – Kachelofen (plural ‘Kachelöfen’ in German) is a type and a brand of masonry stove made out of specialized stove tiles and other refractory materials that originates from central Europe. Unlike brick finished or rendered stoves, kachelofen type stoves usually have a higher surface temperature, depending on their design characteristics. (see, for example, stoveworks.com) More generically, the type name is used to describe the radiant mass heaters that have been in use in Central Europe for thousands of years. Kachelöfen are wood efficient and can produce radiant warmth for up to twelve hours from a single (est. 1 hour) burning of stick wood in the nominally sized burn chamber. A Kachelofen will weigh a thousand pounds or more and may reach from floor to ceiling. Mass is important for heat conservation. They are typically fabricated from many fired clay firebricks, often encased with decorative ceramic tiles. Kachelöfen generally stand on legs similar to more conventional wood stoves. Translates to English as “tile stove”. (See also Masonry Mass Heater)

Kang Bed – A hard clay or brick sleeping surface, originating in China, built above a radiant mass heater similar to a kachelofen laid on its side. The burn box of a kang bed is usually a cooking stove. This provides a nice hot cook stove for an evening meal and then captures the warmth for slow and even heat for sleepers through the night.

Limited Cooperative Association (aka LCA) – a particular style of Limited Liability Corporation (LLC) or unincorporated company (see Article 80 of Title 7, Colorado Revised Statutes). Under this title, all stakeholders are considered Members and the LLC is run under a Member-defined operating agreement that, if in compliance with the Article, “…governs the rights, duties, limitations, qualifications, and relations among the managers, the members, the members’ assignees and transferees, and the limited liability company.” (from 7-80-108).

An LCA is more precisely regulated by “The Uniform Limited Cooperative Association Act (ULCAA)”, and is formed after articles that substantially comply with section 7-58-303(1) and become effective under section 7-90-304 .

see https://www.sos.state.co.us/pubs/business/news/2012/20120402_ULCAA_Dean.html, from section 7-58-303, Colorado Revised Statutes:

“A cooperative organization is one owned by persons who join together (1) to utilize the organization to provide themselves with goods, services or other items, (2) to have democratic control over the association, (3) to provide the basic equity financing for the association, and (4) to share in the financial benefits of the organization in accordance with their respective use of the association. It is not a “not for profit” organization because its profits are returned to its members at the end of each year in cash, evidence of equity investment, rebates or in other forms. Unlike “for profit” organizations, however, traditional cooperatives do not permit outside investment from persons who would have a vote in the governance of the cooperative.” It is up to the cooperative to decide if there will be investors and, if so, whether and of what nature the investors will have a vote.

Masonry Mass Heater – a site-built or site-assembled, solid-fueled heating device constructed mainly of masonry materials in which the heat from intermittent fires burned rapidly in its firebox is stored in its massive structure for slow release to the building. It has an interior construction consisting of a firebox and heat exchange channels built from refractory components. Specifically, a masonry heater has the following characteristics:

  • a mass of at least 800 kg. (1760 lbs.)
  • tight fitting doors that are closed during the burn cycle
  • an overall average wall thickness not exceeding 250 mm (10 in.)
  • under normal operating conditions, the external surface of the masonry heater, except immediately surrounding the fuel loading door(s), does not exceed 110 C. (230 F.)
  • the gas path through the internal heat exchange channels downstream of the firebox includes at least one 180 degree change in flow direction, usually downward, before entering the chimney
  • the length of the shortest single path from the firebox exit to the chimney entrance is at least twice the largest firebox dimension

Masonry Mass Heaters are generally built-in to the building structure.

(mha-net.org, Masonry Heater Association of North America)

(See also Kachelofen)

Metaphysics – The branch of philosophy that studies the fundamental nature of reality, the first principles of being, identity and change, space and time, causality, necessity, and possibility. It includes questions about the nature of consciousness and the relationship between mind and matter. (Wikipedia.org) The branch of philosophy that examines the nature of reality, including the relationship between mind and matter, substance and attribute, fact and value. The theoretical or first principles of a particular discipline. A priori speculation upon questions that are unanswerable to scientific observation, analysis, or experiment. (wordnik.com) The only science capable of inquiring beyond physical and human science. (Univ of Sedona.edu) The study of ultimate cause in the Universe. (metaphysics.com)

Organic Produce – Grown only with the use of soils, composts, fertilizers and other soil amendments that occur in nature, with no use of genetically modified seed, chemically formulated fertilizers, growth stimulants, antibiotics or pesticides, and without the use of manure or animal by-product amendments from industrially slaughtered, factory farmed or genetically modified animals or from animals raised on chemically formulated growth stimulants, antibiotics or pesticide-laden feed. Note: Produce labeled USDA Organic is often not compliant with this definition.

Overshoot – Coined by William Catton. The ecological condition that occurs when a population exceeds the permanent carrying capacity of its habitat. Catton’s work specifically explained and discussed human Ecological Overshoot as already occurring by 1980 across the entirety of the planet.

Passivhaus – an internationally recognized low energy design standard originated by physicist Wolfgang Feist that strives to achieve comfortable buildings with minimal requirements for space heating or cooling. (passivehouse.com)

Permaculture – An ethics-based ecological design system first described by Bill Mollison and ecologist David Holmgren in the 1970s, based on observing and replicating nature. Any system of perennial agriculture emphasizing the harmonious use of renewable resources from nature that enrich the local ecosystem while producing a human-harvestable abundance of plants and animals. These agricultural systems or methods seek to integrate human activity with natural surroundings so as to create highly efficient self-sustaining local ecosystems. Specific systems of permaculture include Food Forests, Riparian plantings and Swales.

Participatory Democracy – A participatory decision making process where each member of an eligible population has one equal vote and a majority vote, generally taken as anything over 50% plus 1 vote of the total population, will decide any issue up for consideration. Not voting becomes a NO vote by default and the decision of the majority is the de facto decision of the population. The democracy is able to establish or change any voting rules and regulations by majority decision including, for example, what constitutes a majority decision and whether or not a representative sub-group may make some decisions.

Post Capitalism – Production or productivity done in accordance with modern practice but without the primacy of a profit motive or the expectation of any capital holder’s financial return, without externalization of true costs and without avoidance of personal responsibility. A recognition that capitalism is an unsustainable economic paradigm that has over-used, misused, and in other ways undone the conditions necessary for sustainability. In post-capital paradigms, the conditions that were acceptable prior to capitalism but lost sustainability under capitalism – driving the abstracted global human economic system rapidly out of balance and nature deeply into ecological overshoot at a non-linear pace – are regarded as capitalistic.

Protoscience – Any work of scientific study that has not yet been adequately tested or is at a premature phase of scientific validation – that is, prior to data gathering and analytical conclusion – so the hypothesis may not yet be capable of being proven false, even if it is indeed false. Theories that are broadly consistent with existing science or, where not consistent, offer a reasonable account of the inconsistency and present a plausible hypothetical outline for testing that inconsistency represent the highest forms of protoscience. In circumstances where survival adaptations might not allow for proper scientific methodologies, protoscience lowers the barriers for urgent progress far more reasonably than can be done using only reactive or overly emotional gambits (e.g., panic, pandemonium, reckless abandon, dogma, pseudoscience, etc.). Protoscience presupposes the same accurate data collection and thorough record keeping methods expected of all scientific analysis.

Regenerative Agriculture – Describes any agricultural practices that focus on the health of the ecological system as a whole, not solely on high production yields of crops or monocultures. Permaculture is design-based regenerative plant agriculture. Animals are often present and of value in regenerative systems, though they are not generally raised for slaughter. Instead they might be natural lawn mowers and natural fertilizer factories that live to a ripe old age as friends of the farmers and, so, are probably never eaten.

Retting – The process of steeping plant stalks in water and drying in which the inner or hurd fibers of the stalks are, by natural action from moisture and air, rendered more easily separable from the outer filament fibers without need to cut or break the outer high tensile strength filament fibers.

Riparian – The buffer zone or area between dry land and a river or stream. Riparian may describe a floodplain, estuary or other wetland.

Rocket Stove Mass Heater

  • heat your home with 80% to 90% less wood
  • exhaust is nearly pure steam and CO2 (a little smoke at the beginning)
  • the heat from one fire can last for days
  • you can build one in a day or two
  • folks have built them spending less than $20
  • less CO2 than natural gas or electric heat
  • if you buy the wood, it costs less to operate than natural gas (richsoil.com)

A rocket stove is a simple J-shaped combustion chamber that burns stick wood cleanly and rapidly. Adapted as a mass heater with cob and other whiz-bangs to store the heat produced, the rocket stove represents a more accessible and affordable mass heater than kachelöfen, kang beds or other Masonry Mass Heaters while offering most of the benefits.

Seventh Generation Principle – A principle taken from the Haudenosaunee, the five tribes of the Iroquois Confederacy, the oldest known participatory democracy on Earth. Haudenosaunee literally means “People building an extended house”, more commonly stated as “People of the Long House.” The longhouse is a metaphor, introduced by the Peacemaker at the formation of the Confederacy, meaning that the people are meant to live together as if many families in the same house. The Seventh Generation Principle is the philosophy that the decisions we make today should result in a sustainable world seven generations into the future. This prescient philosophy is currently somewhat overused as a green-washed marketing ploy to sell everything from dish soap to cars.

Super-adobe – An earth bag construction technique that has been in use at least since the bunkers of World War I and was most recently developed by architect Nader Khalili, beginning around 1984, as a building construction technique. It appears that the first house using his technique was built in 1995. His work recommends a lime or cement and soil mixture packed into poly bags that are layered with barbed wire to mortar the bags together, tamped solidly in place at each layer. The technique has since also been successfully used with soil only, with no cement or lime additive.

Sustainability – the practice of using natural resources responsibly, so they can support both present and future generations. (nationalgeographic.org)

Swale – A swale is a ditch or depression dug ‘on contour’ to catch rainwater before it runs away, allowing the moisture to soak into the soil. Swale establishment is based upon the correct observation that the more water is caught and soaks into the soil, the less supplemental water will be needed by the plants that grow near the swale.

Textiles – Flexible materials made by weaving interlocking bundles of cordage – threads, yarns or ropes – that have been spun to very long continuous lengths, and perhaps by re-twisting raw fibers or spun cordage into larger diameter continuous lengths. Textiles are formed by weaving, knitting, crocheting, knotting, tatting, felting, bonding, braiding, etc. cordage into sheets that can then be cut and stitched to make more useful items.

Transformative Adaptation – The process of continuous personal and societal change necessary to survive as climatic processes change. The unpredictability of the scope, degree and rates of climate change demands a continuous and ongoing process of adaptation. There is no real possibility that any single act or event of adaptation will be sufficient or enduring.

Trombe Wall – A south wall, windowed, indirect or non-ventilated passive solar mass radiant heat storage system that has been in practical use for 150 years, first patented in 1859. Trombe walls excel at maintaining a steady indoor temperature. The sun slowly warms the Trombe wall during sunlit hours, then the wall slowly radiates the stored heat into the interior space over many hours.

Walipini – A partially underground greenhouse that uses ground temperature and indirect mass storage to moderate air temperatures that enable an extended season or, in moderate winters, year-round growing environment.

Worker Owner Cooperatives – Cooperatives are generally described in Colorado Revised Statutes Title 7 Articles 55, 56, 57, 58 and 101. Worker cooperatives are more specifically introduced in Article 56 and codified as Article 58 (SB 11-191, passed 2011).

Worker Owner Cooperatives as proposed here are tradecraft-specific Cooperatives owned equally by all workers in that tradecraft. Apprentice, Journey and Master skill levels may be recognized and may carry different benefits and rewards as determined by the participatory democracy of Worker Owners; however, ownership of the Cooperative is equal among all. All Cooperatives enjoined under the charter of the Federación de Cooperativas de Cuerno Verde also become equity members of the Federation.

(see also Limited Cooperative Association elsewhere in these definitions)

El pueblo unido jamás será vencido (the people, united, will never be defeated)


Console Tools for the SQL Server for Linux CTP2.1 Docker Image

In my ‘lab’ I have my last three laptops and a Beaglebone Black. The laptops all came with a retail Windows OS (10, 8, 7). All have long been running Linux (openSUSE, Linux Mint and CentOS at the moment, but after 18 months I’m ready to admit openSUSE is not among my preferences for desktop Linux purposes – it’s just such a pain to…). The Beaglebone is on Debian Jessie for ARM. Not a Windows box to be found other than a lingering dual-boot that hasn’t been accessed – or updated! – in more than a year now on the most recent machine.

I was easily tempted when SQL Server for Linux hit the Internet. But playing around with the earlier CTPs without a Windows box was not very exciting. All I had was sqlcmd and bcp. Eventually I scrounged up a few more tools.

This post talks about some of the command line tools I found usable. If you have SQL Server for Linux in your future, this may be interesting and useful to you. It occurs to me that being able to query and configure a SQL Server from its console is a fundamental and essential requirement: just in case you cannot connect from anywhere else and your data is valuable to you.

bash

The set of out-of-the-box console query tools for SQL Server on Linux is the empty set. The server-side components of something M$ is calling ODBC are there, but the client-side ODBC mssql-tools package that provides sqlcmd and bcp must usually be installed separately to get to your data. The official Docker SQL Server image (from CTP2) comes with mssql-tools pre-installed (found in /opt/mssql-tools/bin inside the container).
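On a package-installed (non-Docker) server you typically add the client tools yourself. A minimal sketch for Ubuntu 16.04, assuming the Microsoft package repository locations documented at the time (adjust for your distribution and release):

# register the Microsoft repository and install the ODBC client tools
> curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
> curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | sudo tee /etc/apt/sources.list.d/msprod.list
> sudo apt-get update && sudo apt-get install -y mssql-tools unixodbc-dev
# smoke test the connection from the console
> /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P '<YourStrongPassw0rd>' -Q 'SELECT @@VERSION'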

The mssql-conf utility comes with every SQL Server for Linux installation, whether or not the client tools are installed. Indeed, mssql-conf can come in handy to get things set up in the file system. It should also be good for start-up trace flag configuration and TLS.
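For example, a start-up trace flag and forced connection encryption might be configured like so on a package-installed instance – a sketch only; the setting names come from the mssql-conf list shown later in this post, and for the Docker image you would restart the container rather than the service:

# enable a start-up trace flag and require encryption for incoming connections, then restart
> sudo /opt/mssql/bin/mssql-conf traceflag 1222 on
> sudo /opt/mssql/bin/mssql-conf set network.forceencryption 1
> sudo systemctl restart mssql-server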

One of the first things you may want to do upon installation is look at the SQL Server errorlog, suggesting cat, tail and head could also be counted among the out-of-the-box tools at your disposal, as well as whatever text editors are available on the server (e.g. EMACS, Gedit, Nano, Kwrite, LibreOffice, Bluefish, Atom, vi, etc.) and sudo/root level access to the SQL Server files. Bluefish and Atom are IDEs with some OK support for SQL syntax, giving you the ability to review and modify your library of SQL scripts with autocomplete and color formatting to highlight SQL syntax. In slight conflict with my earlier mild disdain for openSUSE’s KDE desktop, even Kwrite is not unusable as a SQL editor. One side effect for authors of ad hoc queries is that you can end up flopping between apps to toggle between building the queries, running the queries and seeing the results. Even to correct a silly typo…

When using the SQL Server on Linux image from Docker Hub, you will want to include the Docker CLI in your tool set. The Docker CLI is used to start and stop the SQL Server container rather than stopping and starting the SQL Server service (it is still possible to stop only the SQL Server service, of course, albeit a bit convoluted). You open an interactive shell inside the running container to get to the mssql-tools included in the image. All mssql-conf work for that image is done inside the running container (that is, local to the SQL instance).

> docker exec --interactive --tty <container-id> /bin/bash
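The container itself is also created, stopped and started with the Docker CLI. For the CTP-era official image that looked roughly like the following – the image name, environment variables and the optional volume mapping are as documented at the time and may differ for your release:

# create the container, optionally persisting /var/opt/mssql on the host
> docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<YourStrongPassw0rd>' -p 1433:1433 -v /var/mssql:/var/opt/mssql --name sql1 -d microsoft/mssql-server-linux
# stopping the container stops the SQL Server instance along with it
> docker stop sql1
> docker start sql1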

The more stable and deployable way to create SQL Server meta-data or move data in bulk has always been by script. SQL meta-data, as well as routine and repetitive tasks expressed as sqlcmd batch scripts, are powerful and generally portable – from one SQL Server to the next anyway. For the most part, a T-SQL script that works with SQL Server on a Windows server also works with SQL Server on Linux. Few things could so discouragingly encourage scripted work more than a sqlcmd prompt staring back at you.
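For instance, a bulk export by script from the console might look something like this (the sample database, table and connection details here are placeholders):

# bulk copy a table out to a character-format data file
> bcp AdventureWorks2014.Sales.Currency out /tmp/currency.dat -S localhost -U sa -P '<YourStrongPassw0rd>' -c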

With the CTP there remain a few unseen obstructions in the port, and there will probably always be a potential for annoying cross-platform inconsistencies. SQL Trace and Extended Event data, for example, are written to the file system at /var/opt/mssql/log by the SQL Server on Linux but are all but useless from the Linux console. Extended Stored Procedures (xp_) that touch the file system are generally not working (yet?). Also, when script file archives are transferred to the Linux system weird things can happen, like directional Unicode quotation glyphs replacing ASCII quotes, causing working and tested scripts to fail at parse time in the database engine because of the tilt direction of the quotation marks. The fix is usually to avoid copying scripts to Linux in the way that introduced the inconsistencies.
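One quick way to spot that kind of damage before the database engine parser does is a sketch like this with GNU grep on Ubuntu (the file name is a placeholder):

# list any lines containing non-ASCII characters such as directional quotation marks
> grep -nP '[^\x00-\x7F]' transferred_script.sql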

Writing ad hoc queries with sqlcmd when debugging or researching an issue can be exasperating, especially when the query doesn’t fit nicely on one line. Sure, sqlcmd understands line feeds, but once you push enter to create a line, there is no changing it. A batch that can be edited regardless of the number of lines in that batch would be preferable. As a compromise, use the input file (-i) sqlcmd command line switch and a collection of scripts. As previously mentioned, an editor like Atom or Bluefish provides enhanced query authoring support. Note that script libraries stored remotely from the SQL Server create a remote dependency that may be better avoided for improved availability at critical times. Better to keep essential scripts at the server and maintain the master copy in source control.
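A minimal sketch of that compromise (file names and connection details are placeholders):

# author or fix the query in your editor of choice, then submit the saved file
> sqlcmd -S localhost -U sa -P '<YourStrongPassw0rd>' -i ./scripts/wait_stats.sql -o ./scripts/wait_stats.out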

sql errorlog

When the Docker container for a SQL Server instance is created including a persisted volume – or volumes, if only to project hopefulness – there are different choices for reviewing the SQL errorlog than when you need to get into the container in order to see any of SQL Server’s output files. Fortunately, everything is still in the familiar /var/opt/mssql/log subdirectory by default regardless of which case you find yourself in. One difference is that, without the persisted volume, you will need to add the step of opening an interactive shell to work with the files inside the container. You also need to be vigilant of the volume being unexpectedly dropped from sync with the container during some ‘maintenance’ operation or another. And in either scenario, you can also look at the Docker logs to query all the log records available to the Docker Engine. Turns out the Docker log IS the collection of SQL error logs all in one view.

> docker container logs <container-id>

The command includes a tail-like set of options to filter the log records returned either by file position or timestamp. As mentioned previously, several tools are included with Linux that will vary somewhat by distribution. I don’t think you will find one that does not include cat. The ‘official’ SQL Server image is built on Ubuntu 16.04, so the GNU coreutils head and tail commands are in /usr/bin. I believe that may be true for all three supported Linux flavors. You are root when you open an interactive bash shell inside a container. For a package-installed SQL Server, you will likely need root or sudo rights to access the SQL Server’s files.

> docker container logs --help
Usage: docker container logs [OPTIONS] CONTAINER

Fetch the logs of a container

Options:
 --details Show extra details provided to logs
 -f, --follow Follow log output
 --help Print usage
 --since string Show logs since timestamp
 --tail string Number of lines to show from the end of the logs (default "all")
 -t, --timestamps Show timestamps
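Putting a couple of those options to use (the container id is a placeholder):

# show only the most recent log records, with timestamps, then follow anything new from the last ten minutes
> docker container logs --tail 50 --timestamps <container-id>
> docker container logs --follow --since 10m <container-id>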

mssql-conf

The mssql-conf utility exposes settings comparable to editable settings in the service start-up information of a SQL Server instance running on Windows. Various database file locations and properties can be set and SQL Server trace flags can be enabled/disabled. mssql-conf is a work in progress and documented with the most recently released changes at https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-configure-mssql-conf.

root@63fe05e40911:/# /opt/mssql/bin/mssql-conf
 usage: mssql-conf [-h] [-n]  ...

positional arguments:

setup Initialize and setup Microsoft SQL Server
 set Set the value of a setting
 unset Unset the value of a setting
 list List the supported settings
 traceflag Enable/disable one or more traceflags
 set-sa-password Set the system administrator (SA) password
 set-collation Set the collation of system databases
 validate Validate the configuration file

optional arguments:
 -h, --help show this help message and exit
 -n, --noprompt Does not prompt the user and uses environment variables only
 defaults.
root@63fe05e40911:/# mssql-conf list   
 network.tcpport TCP port for incoming connections
 network.ipaddress IP address for incoming connections
 filelocation.defaultbackupdir Default directory for backup files
 filelocation.defaultdumpdir Default directory for crash dump files
 traceflag.traceflag Trace flag settings
 filelocation.defaultlogdir Default directory for error log files
 filelocation.defaultdatadir Default directory for data files
 hadr.hadrenabled Allow SQL Server to use availability group...
 coredump.coredumptype Core dump type to capture: mini, miniplus,...
 coredump.captureminiandfull Capture both mini and full core dumps
 network.forceencryption Force encryption of incoming client connec...
 network.tlscert Path to certificate file for encrypting in...
 network.tlskey Path to private key file for encrypting in...
 network.tlsprotocols TLS protocol versions allowed for encrypte...
 network.tlsciphers TLS ciphers allowed for encrypted incoming...
 language.lcid Locale identifier for SQL Server to use (e...
 sqlagent.errorlogginglevel 1=Errors, 2=Warnings, 4=Info
 sqlagent.errorlogfile SQL Agent log file path
 sqlagent.databasemailprofile SQL Agent Database Mail profile name
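For example, to point backups at a persisted location on a package-installed instance – a sketch only; for the Docker image run the same mssql-conf command inside the container and then restart the container:

# set the default backup directory, then restart for the change to take effect
> sudo /opt/mssql/bin/mssql-conf set filelocation.defaultbackupdir /var/opt/mssql/backup
> sudo systemctl restart mssql-server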

When using the Docker Image, use the Docker interactive prompt command previously described to get to the utility from a shell on the host. When the mssql volume is persisted on the Docker host, the mssql.conf configuration file will be viewable directly from the host, but access to the utility to make any changes must be done using the mssql-conf tools found only within the container.

NPM

The name ‘mssql’ is somewhat ambiguous. It is the name of the sub-folder where SQL errorlogs, data files and the mssql.conf configuration settings file are located (/var/opt/mssql), as well as of the /opt/mssql sub-folder where the sqlservr executable lives. This latter folder also includes a library of Python files that comprise mssql-conf. Other than to mention them now to help compare the Linux and Windows contexts, none of these are what is meant in this post by references to mssql. Instead our references apply specifically to the open source node-mssql npm driver for SQL Server (https://github.com/patriksimek/node-mssql) that uses the Microsoft maintained tedious javascript TDS implementation (https://tediousjs.github.io/tedious/) from a Linux client machine to query any SQL Server.

mssql is the package name at npmjs.org and in common usage as well, but the package source follows a deprecated naming convention with a ‘node’ prefix for the github.com repository name. The truth is, the user never has to explicitly invoke node.js when this package is installed globally and used at the bash prompt. It acts pretty much like any other command-line binary.

node-mssql

When the MIT licensed mssql npm package is installed globally and an appropriate ‘.mssql.json’ configuration file has been created, queries can be piped through tedious to the SQL Server directly – if a bit awkwardly – from the bash prompt. To get this set up, assuming node.js is already on the machine:

1. Install mssql globally:

> sudo npm install -g mssql

2. Create a ‘.mssql.json’ configuration file somewhere in your path (or type this connection information object as an argument to mssql every time you use it, your choice).

> echo '{ "user": "sa", "password": "<YourStrongPassw0rd>", "server": "localhost", "database": "AdventureWorks2014"}' > .mssql.json

Or, for a Docker SQL Server container from the Docker host (just like any other SQL Server available on the network at port 1433):

> echo '{ "user": "sa", "password": "<YourStrongPassw0rd>", "server": "172.17.0.1", "database": "AdventureWorks2014"}' > .mssql.json

3. Pipe a query in, get a consistently formatted JSON array of query result out

> echo "select name from sysdatabases"|mssql /home/bwunder/.mssql.json

or – if the .json file's path is in $PATH or the file is in the current directory – use the shorthand:

> echo "select * from sys.databases where name='master'" | mssql

But double quotes wrapped in single quotes will fail:

> echo 'select * from sysdatabases where name="master"' | mssql

And multi-line queries break as soon as you enter that first line-feed.
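Of course, nothing says the package has to be driven from bash. Required from a node.js script, something along these lines ought to work with the mssql v4.x promise API, reusing the same connection values as the .mssql.json example above (server, password and database are placeholders to adjust):

var sql = require('mssql');

var config = {
  user: 'sa',
  password: '<YourStrongPassw0rd>',
  server: 'localhost',            // or 172.17.0.1 from the Docker host
  database: 'AdventureWorks2014'
};

sql.connect(config)
  .then(function (pool) {
    return pool.request().query('select name from sysdatabases');
  })
  .then(function (result) {
    // result.recordset is an array of row objects, much like the JSON the CLI pipes out
    console.log(JSON.stringify(result.recordset));
    return sql.close();
  })
  .catch(function (error) {
    console.error(error);
    sql.close();
  });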

*Important Note: Installing both the sql-cli package mentioned below and the node-mssql package mentioned above globally creates a global name conflict – the last one installed becomes known as mssql globally at the bash prompt. The last package (re)installed always wins: installing either package with the -g switch breaks the other even when that other is already installed and working, but it is easy enough (though not a good practice, I suspect) to toggle to and fro explicitly with another install of the one you want at the moment. Internally, sql-cli requires mssql. An npm uninstall of mssql when sql-cli was the last of the two to be installed produces the insightful warning:

bwunder@linux-niun:~> sudo npm uninstall -g mssql
 root's password:
 npm WARN gentlyRm not removing /usr/local/bin/mssql as it wasn't installed by /usr/local/lib/node_modules/mssql

sql-cli

When, instead*, the Apache licensed sql-cli NPM package is installed globally you get an interactive query engine client out of the box. With this package, when you hit enter – each and every time you hit enter – what you typed goes to the SQL Server. Like mssql, you only get one line of typed query text per round trip to the SQL Server. As with sqlcmd, however, submission by script file does not carry a limitation on the line feeds allowed within the script. It is only the ad hoc command line usage that is limited to query tweets.

~> sudo npm install -g sql-cli
~> mssql -s 172.17.0.1 -u sa -p '<yourstrong!passw0rd>' -d AdventureWorks2014
 Connecting to 172.17.0.1...done 

sql-cli version 0.6.2
 Enter ".help" for usage hints.
 mssql> .help 
 command             description
 ------------------  ------------------------------------------------
 .help               Shows this message
 .databases          Lists all the databases
 .tables             Lists all the tables
 .sprocs             Lists all the stored procedures
 .search TYPE VALUE  Searches for a value of specific type (col|text)
 .indexes TABLE      Lists all the indexes of a table
 .read FILENAME      Execute commands in a file
 .run FILENAME       Execute the file as a sql script
 .schema TABLE       Shows the schema of a table
 .analyze            Analyzes the database for missing indexes.
 .quit               Exit the cli
mssql> select name from sysdatabases

name
 ------------------
 master
 tempdb
 model
 msdb
 sqlpal
 AdventureWorks2014

6 row(s) returned

Executed in 1 ms
 mssql> .quit

sql-cli also:

  • has a helpful, but limited set of command options for running a script or browsing the catalog with comfortably formatted tabular query results.
  • does indeed depend on mssql but probably not on the latest release of mssql.
  • Includes a few catalog CLI dot commands (e.g., .help) useful to the query developer.
  • works like an sqlcmd -q interactive session except that you never have to type GO in order to send a batch to the SQL Server like you must when using sqlcmd interactively.

The big loss when using sql-cli is bash. That is, the user must leave the sql-cli process in order to work at the bash prompt. With mssql the user is always at the bash prompt.

~

Multi-line ad hoc T-SQL queries are mostly not meant to happen with mssql or sql-cli. sqlcmd will at least let you enter them, even if only in an awkward, non-editable, buffered line reader kind of way. You will have to decide which you prefer, else toggle as needed. When properly configured in an Active Directory Domain, mssql, sql-cli, sqlcmd and bcp on a Linux client should be able to connect to and query a SQL Server on any Windows or Linux server available to it in that Domain.

Chances are quite good that your node.js oriented tools will be using mssql elsewhere. However, sql-cli feels like a somewhat more useful console command line given the handful of catalog and query tools that come with the interface.


sqlpad

If there is a Chrome browser on the server, sqlpad is a great query tool built for and upon the mssql package. Firefox and sqlpad don't seem to get along. The sqlpad 2.2.0 NPM package is using mssql v "^3.0.0.0" according to its package.json. Some newer stuff may be missing, but the query tool is a pleasure compared to the command prompt, and tabular results are oh so much nicer in the DOM in a browser than when echo'd to a terminal. With the drop-down, IntelliSense-style autocomplete presenting database catalog objects and property selections as hints in the query window, a query UI can't get much easier to work with than sqlpad.

~> sqlpad --dir ./sqlpaddata --ip 127.0.0.1 --port 3000 --passphrase secr3t --admin bwunder@yahoo.com --save
 Saving your configuration.
 Next time just run 'sqlpad' and this config will be loaded.
 Config Values:
 { ip: '127.0.0.1',
  port: 3000,
  httpsPort: 443,
  dbPath: '/home/bwunder/sqlpaddata',
  baseUrl: '',
  passphrase: 'secr3t',
  certPassphrase: 'No cert',
  keyPath: '',
  certPath: '',
  admin: 'bwunder@yahoo.com',
  debug: true,
  googleClientId: '',
  googleClientSecret: '',
  disableUserpassAuth: false,
  allowCsvDownload: true,
  editorWordWrap: false,
  queryResultMaxRows: 50000,
  slackWebhook: '',
  showSchemaCopyButton: false,
  tableChartLinksRequireAuth: true,
  publicUrl: '',
  smtpFrom: '',
  smtpHost: '',
  smtpPort: '',
  smtpSecure: true,
  smtpUser: '',
  smtpPassword: '',
  whitelistedDomains: '' }
 Loading users..
 Loading connections..
 Loading queries..
 Loading cache..
 Loading config..
 Migrating schema to v1
 Migrating schema to v2
 Migrating schema to v3 

bwunder@yahoo.com has been whitelisted with admin access.

Please visit http://localhost:3000/signup/ to complete registration.
 Launching server WITHOUT SSL

Welcome to SQLPad!. Visit http://127.0.0.1:3000 to get started

The browser page opens to a login screen. You must create a sqlpad user (sign up) before you can sign in. SQL password entry will come after you sign in when you create connections.

You get some charting abilities with sqlpad too. Nothing fancy, but useful for an aggregate IO, CPU or storage time-series visualization to be shared with colleagues.

sqlpad does seem to lose its mind now and again – I don't know, maybe it's just me – but a restart of the app seems to make everybody happy again. SOP.

~

It may behoove anyone responsible for a SQL Server for Linux instance to get familiar with sqlpad, mssql and sql-cli, as well as to rediscover sqlcmd and bcp at the command line, master mssql-conf and find whatever other tools like sqlpad you can. This is true even if/when the intention is to always use the Microsoft supported Windows client UI (SSMS, Visual Data Tools, etc.) for all development, testing and database administration. The thing is, we can never know when access at the server console will suddenly become the most expedient – perhaps the only – option for urgent – perhaps essential – access to a given SQL Server instance until that moment is at hand.

Personally, I like sqlpad but don't really care much for the terminal query experience in any flavor I have tried should no browser be handy. In truth, before I 'found' sqlpad I had built my own console tool that would at least accept a multi-line query that can be edited, then executed either with sqlcmd over ODBC to get a tabular result or with mssql v4.x over tedious to get JSON results. I affectionately named my tool sqlpal long ago. That was supposed to represent its foundations in SQL Server and the Vorpal node.js CLI from NPM. Get it? sql+pal? Then I started seeing references to SQL-PAL and PAL in the Microsoft vendor documentation concerning the CTP. Turns out SQL-PAL is the name for the interop layer between SQL Server and the Linux kernel or some such thing. Sorry for any confusion.

Sqlpal also includes some core Docker management automation plus the Vantage wrapper for Vorpal that enables remote ssh-like encrypted peer-to-peer connectivity and a locally managed IP firewall, among other things. Feel free to check it out if you want your SQL Server CTP to run in a container: https://www.github.com/bwunder/sqlpal. Docker unfolds a SQL Server for Linux development platform for folks that already know SQL Server but may not necessarily have worked with Linux or Docker much… yet.

Given a bit more refinement of the CTP, SQL Server for Linux in a container could even perform adequately behind many (most?) apps when released for production use in the near future. In the meantime, I'm going to investigate any possibility of combining sqlpad for the browser and its authentication protocols with sqlpal's batch caching, query object and script folder hooks for the bash shell, because I need something to do.

Posted in node.js

ballad for a data miner

First you save some tuples, then you lose your scruples
people help you along your way, not for your deeds, but how you say
to take their privacy and freedom, just remind ’em they don’t pay
take their money too on a ploy, use what you’ve learned from loggin’ all day

canto:
Cartesian aggregations, Poisson Distributions
causal correlations and standard deviations
You can’t sell your sexy underwear to little kids who watch TV bears
unless your cuddly cartoon cub convinces them that mom…. won’t… mind…
(con variazioni: money… is… love…, wrong… is… right…, yes… means… no…)

Analyzing, calculating, place your bets, stop salivating
map reduce then slice and dice, this new ‘gorithm is twice as nice
money making, manipulating, paying to play and kick-back taking
suffering fools but taking their payoffs, while spying on staff to cherry-pick the lay-offs

canto

Watching out for number one, what the hell, it's too much fun,
to those that worked so tirelessly, Thank You Suckers! but no more pay
So many losers along the way, “stupid people” getting in your way
Is that an angry mob in your pocket? Is that a golden fob that you got on it?

canto

This corporation’s got no conscience and global economies won’t scale
when free is just a profit center and people cheap commodities
who’s choices are all illusions, fed by greed and false conclusion,
who’s purpose is to do your bid, fight pointless wars and clean your crapers, please

canto

vamp till cue
First you save some tuples then you lose your scruples
People help you on your way not by your words but how you say

finale
Spent his last day in hole
won’t be going down any more
work for the man, live while you can
won’t be to long before your bound from this land

Posted in Privacy

ad hoc T-SQL via TLS (SSL): Almost Perfect Forward Secrecy

The day the Heartbleed OpenSSL ‘vulnerability’ [don't they mean backdoor?] hits the newswires seems an ideal moment to bring up an easy way to wrap your query results in an SSL tunnel between the database server and wherever you happen to be, with whatever device you happen to have available, using node.js. (Also see the previous post re: making hay from you know what with node.js. And PLEASE consider this post as encouragement to urgently upgrade to OpenSSL 1.0.1g without delay! – uh.. update March 3, 2015: make that 1.0.1k – see https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-02 – and/or node.js v0.10.36, see https://strongloop.com/strongblog/are-node-and-io-js-affected-by-the-freak-attack-openssl-vulnerability/ – geez, will the NSA ever come clean on all the back doors they own?)

Heartbleed is clearly the disclosure of a probably intentional free swinging back door in open source software being poorly disguised as a vulnerability discovered after years in the wild. I'm afraid, "Oh, gee, I forgot to test that…" just doesn't cut it when you're talking about OpenSSL. That just made every one of us that has been advocating for open source software as a pathway toward the restoration of secure computing and personal privacy look like feckless dumb shits: as big o' fools as those politicians from the apropos other party… You know who I'm talking about – the dumb-as-craps and the repugnant-ones or something like that… All classic examples of puppet politicians – as are we puppet software engineers – mindlessly serving the 'good enough' mentality demanded of today's necessarily exuberant and young software engineers and as has been incumbent upon politicians throughout the times as humanity slogs this now clearly unacceptable – but nicely profitable for a few – course we travel… toward some glorious and grand self-annihilation – so they need us to believe anyway to justify the terminal damage they inflict upon the planet for self-profit.

In my estimation, the only lesson that will be learnt by proprietary software vendors and open source communities alike from the cardiac damage that OpenSSL is about to endure as a result of this little old bleeding heart will be to never admit anything. Ever. Some things never change.

OpenSSL just might not survive without the accountability that is established through full disclosure – at least about what really happened here but preferably as a community. Preferably a disclosure to provide compelling evidence that nothing else so sinister is yet being concealed. I doubt that can happen without full and immediate disclosure from every individual involved in every design decision and test automation script implemented or used during the creation, development and community review of that software. And I doubt any software organization or community would be able to really come clean about this one because – and I admit this is opinion based mostly on how I have seen the world go ’round over the last 60 years – maybe even a community building foundation of open source software such as OpenSSL can be ‘persuaded’ to submit to governmental demands and somehow also remain bound to an organizational silence on the matter? Prepare yourselves for another doozy from one of the grand pooh-bah – and real bad liars – from the NSA before all is said and done on this one.

May 9, 2014 – So far General Clapper has delivered as expected. On the tails of his April Fools Day admission of what we already knew – that the NSA has conducted mass surveillance of American Citizens without warrant or suspicion for quite a while – he first denied having ever exploited the OpenSSL buffer back door in a bald-faced lie that he stuck with for maybe a week or three, and now he is merely reiterating an older, but extremely disturbing, tactical right he has claimed before for the NSA: to not reveal to even American and ally owners or to American and ally maintainers of open source code or hardware any exploitable bugs known by the NSA. All the owners and maintainers get to know about are the backdoors they were coerced to willingly implement. That is just plain outrageous. A standard for tyranny is established. I guess we should be at least glad that the pooh-bah has been willing to share his despotic rule – at least in public – with first "W" and then Bronco. Hell, Bronco even got us to believe that keeping the pooh-bah on his throne was a presidential decision. We will have to wait and see if he can tolerate Monica Bengazi, I reckon.

I wonder if we will ever hear that admission of the ultimate obvious truth that the NSA is covertly responsible for the existence of the OpenSSL back door? This must scare the hell out of Clapper's inner circle – whoever they might be? Once they are forced to admit the first backdoor it won't be long before the other US Government mandated back doors to our privacy begin to surface and close. I have no doubt there will be a whole lot more colluding public corporations than just Microsoft, Apple and Google. I know it's deep and ugly, but I honestly have no idea just how deep and ugly. All I can see clearly is that there must be a good reason our Government has made such a big deal out of unrevealed backdoors planted for the Chinese Government in Huawei's network hardware…


I made the claim in the title that this technique is using ad hoc queries. That needs some qualification. Queries in the example code below are submitted asynchronously by a node.js https server running at the database server. The query is not exactly ad hoc because you must place the SQL in a text file for use by the node.js https server before starting the node server; then you can execute the query from any browser with an IP path to the node server. While there is always a way to get to the text file and edit the query if need be, the idea described here is more useful for those ad hoc queries you run a few times over a few hours or days to keep an eye on something, then might never use again. The https server would only be of importance if there were sensitive data in the query results and you wished to avoid serving it on the network as clear text. If that is true, then the user interface you normally use is a better option wherever you can use it. The node server lets you see the query result from any device with a browser, or from a 'private' browser session of someone else's device with a browser.

SQLbySSL

An OpenSSL generated key for self-signing, or else a CA signed certificate, is required on the database server before starting node. You could install the key and certificate in the local key repository, but that is not the method used here. Instead, a key and a certificate signing request are generated with OpenSSL. The key and self signed cert are kept in the node.js server's root folder. You may need to ignore an "unable to write 'random state'" message from OpenSSL during key (1) and cert (3) generation. Keep in mind that when using a self signed certificate you must also click thru a browser warning informing you that the certificate is not signed by a certificate authority (CA). A few modern browsers will not allow you to click through this screen so will not work here – stock Chrome, Firefox, Android and Safari work just fine. Also keep in mind that anyone that can get your key and certificate can decipher a cached copy of any bits you shoved through SSL tunnels built with that key and certificate. Guard that key closely.

three ways to a self-signed certificate that will encrypt a TLS 1.2 tunnel
1. prompt for file encryption phrases and distinguished name keys
  # genrsa and similar are superseded by genpkey (was: openssl genrsa -out key.pem 1024)
  openssl genpkey -algorithm RSA -out key.pem -pkeyopt rsa_keygen_bits:1024
  openssl req -new -key key.pem -out request.csr
  openssl x509 -req -in request.csr -signkey key.pem -out cert.pem 

2. no prompts - your distinguished name (DN)
  openssl genpkey -algorithm RSA -out key.pem -pkeyopt rsa_keygen_bits:1024 -pass pass:keyFileSecret
  openssl req -new -key key.pem -passin pass:keyFileSecret -out request.csr -passout pass:certFileSecret -subj /DC=org/DC=YABVE/DC=users/UID=123456+CN=bwunder -multivalue-rdn
  openssl x509 -req -in request.csr -signkey key.pem -out cert.pem 

3. one command - no request file - no prompts
  openssl req -x509 -newkey rsa:1024 -keyout key.pem -out cert.pem -passin pass:keyFileSecret -passout pass:certFileSecret -days 1 -batch 

The key used to generate the request is used to sign the request certificate. Certificate and key are saved as .pem files in the node.js server folder. You could even roll-ur-own perfect forward secrecy – that is to say, automate the generation and signing of a new key before every request. Not quite perfect, but this could allow you to keep going in 'manual mode' with or without an urgent upgrade to close a risk that is not considered a risk when using perfect forward secrecy – at least until perfect forward secrecy is rendered ineffective in a few years.

Adding the one command key generation as the "prestart" script in the node app's package.json will get you a new key each time you start the node.js server.
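Something like this in package.json should do it – assuming the server entry point is a file named server.js, and reusing the one-command generation from example 3 above:

{
  "scripts": {
    "prestart": "openssl req -x509 -newkey rsa:1024 -keyout key.pem -out cert.pem -passin pass:keyFileSecret -passout pass:certFileSecret -days 1 -batch",
    "start": "node server.js"
  }
}

npm runs the "prestart" script automatically each time "npm start" is used to launch the server.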

You may need to allow inbound TCP traffic on the port serving SSL pages (8124 in the example) in your firewall if you want to hit the query from your smart phone's browser or from any remote workstation that can ping the database server on the port assigned to the https server – and present your Windows domain credentials for authentication, unless you hardcode a SQL login username/password in the connection string (not recommended).

Speaking of which, edge expects to find a connection string to the SQL Server in an environment variable set in the session where node.exe is called, before the node process is started.

SET EDGE_SQL_CONNECTION_STRING=Data Source=localhost;Initial Catalog=tempdb;Integrated Security=True

Lastly, when the node server is started you will be prompted at the console to enter a PEM pass phrase. It is not clear from the prompt, but this is the -passout phrase used when the encrypted private key file was generated – 'certFileSecret' in the one-command example above.

Happy Heartbleed day!


/*
  npm install edge
  npm install edge-sql
*/
var edge =require('edge');
var sys =require('sys');
var https =require('https');
var fs =require('fs');

var port = 8124;
var options = {
  key: fs.readFileSync('./key.pem'),
  cert: fs.readFileSync('./cert.pem')
};

var sqlQuery = edge.func('sql', function () {/*
  SELECT top(10) qs.total_worker_time AS [total worker time]
     , qs.total_worker_time/qs.execution_count AS [average worker time]
     , qs.execution_count AS [execution count]
     , REPLACE(
         SUBSTRING( st.text
                  , ( qs.statement_start_offset / 2 ) + 1
                  , ( ( CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH( st.text )
                        ELSE qs.statement_end_offset
                        END - qs.statement_start_offset ) / 2 ) + 1 )
         , CHAR(9)
         , SPACE(2) ) AS [query text]
  FROM sys.dm_exec_query_stats AS qs
  CROSS APPLY sys.dm_exec_sql_text( qs.sql_handle ) AS st
  ORDER BY total_worker_time DESC;
*/});

// listens for query results to dump to a table
https.createServer( options, function( request, response ) {
  sys.log('request: ' + request.url );
  if ( request.url==='/' ) {
    sqlQuery( null, function( error, result ) {
      if ( error ) throw error;
      if ( result ) {
        response.writeHead( 200, {'Content-Type': 'text/html'} );
        response.write('<!DOCTYPE html>');
        response.write('<html>');
        response.write('<head>');
        response.write('<title>SQLHawgs</title>');
        response.write('</head>');
        response.write('<body>');
        response.write('<table border="1">');
        if ( sys.isArray(result) )  {
          response.write('<tr>');
          Object.keys( result[0] ).forEach( function( key ) {
            response.write('<th>' + key + '</th>');
          });
          response.write('</tr>');
          result.forEach( function( row ) {
            response.write('<tr>');
            Object.keys( row ).forEach( function( key ) {
              if ( typeof row[key] === 'string' && row[key].length >= 40 ) {
                response.write('<td><textarea disabled>' + row[key] + '</textarea></td>');
              }
              else {
                response.write('<td>' + row[key] + '</td>');
              }
            });
            response.write('</tr>');
          });
        }
        else  {
          Object.keys( result[0] ).forEach( function( key ) {
            response.write( '<tr><td>' + key + '</td><td>' + result[0][key] + '</td></tr>');
          });
        }
        response.write( '</table>' );
        response.write( '</body>' );
        response.write( '</html>' );
        response.end();
      }
      sys.log("rows returned " + result.length);
    });
  }
}).listen(port);

sys.log('listening for https requests on port ' + port);

Posted in Privacy, Secure Data

Making JSON Hay out of SQL Server Data

Moving data in and out of a relational database is a relentless run-time bottleneck. I suspect you would agree that effecting metadata change at the database is even more disruptive than the run of the mill CRUD. I often hear the same [straw] arguments for a new cloud vendor or new hardware or new skill set or a code rewrite to relieve throughput bottlenecks. But what if what you really need is a new data store model? What if you have been building, and rebuilding, fancy databases as cheaply as possible on scenic oceanfront property beneath a high muddy bluff across the delta of a rushing river from a massive smoking volcano in the rain? That is to say, maybe an RDBMS is not quite the right place to put your data all [most?] of the time? Maybe… just maybe… SQL Server – and Oracle and PostgreSQL – are passé and the extant justifications for normalization of data are now but archaic specks disappearing into the vortex of the black hole that is Moore’s Law?

On the off chance that there is some truth to that notion I figure it behooves us to at least be aware of the alternatives as they gain some popularity. I personally enjoy trying new stuff. I prefer to take enough time examining a thing so that what I am doing with it makes sense to me. In late 2012 the open source MongoDB project caught my attention. I was almost immediately surprised by what I found. Intelligent sharding right out of the box for starters. And MongoDB could replicate and/or shard between a database instance running on Windows and a database instance running on Linux for instance, or Android or OSX [or Arduino or LONworks?]. And there was shard aware and array element aware b-tree indexing, and db.Collection.stats() – akin to SQL Server’s SHOWPLAN. Even shard aware Map-Reduce aggregations so the shards can be so easily and properly distributed across an HDFS – or intercontinental cluster for that matter – with ease. And tools to tune queries!  I was hooked in short order on the usability and the possibilities so I dug in to better understand the best thing since sliced bread.

The “mongo shell” – used for configuration, administration and ad hoc queries – lives on an exclusive diet of javascript. Equally easy to use API drivers are available from MongoDB.org for Python, Ruby, PHP, Scala, C, C++, C#, Java, Perl, Erlang and Haskell. There is more to the API than you find with the Windows Azure or AWS storage object, or Cassandra or SQLite for that matter, but still not as much complexity for the developer or waiting for results for the user as is invariably encountered with relational models.

In the course(s) of learning about the API and struggling to remember all the things I never knew about javascript and the precious few things I never knew I knew about javascript, I found myself working with – and schooling myself on – node.js (node). Node is a non-blocking single threaded workhorse suitable for administrative work and operational monitoring of servers, smartphones, switches, clouds and ‘the Internet of things‘. The mongo shell is still the right tool for configuration, indexing, testing and most janitorial grunt work at the database. Unlike node, the shell is not async by default and not all of the low-level power of the mongo shell is exposed through the APIs. Nonetheless, node uses the native javascript MongoDB API. And I must say that having the application and the db console in the same language and using the exact same data structures is huge for productivity. Minimal impedance for the DBA to force the mental shift between mongo server and node callbacks. Virtually no impedance for the developer to mentally shift between app layers and data layers!

Perhaps node.js is a seriously excellent cross platform, cross device administrative tool as I believe, but I can only guarantee that it is fun. It is an open source environment with potential beyond any I can imagine for Powershell or ssh. Packages that expose functionality through javascript libraries and are written in C, C++, C#, Java and/or Python exist for connection to node. node makes no bones that MongoDB is the preferred data store. I am by no means a data store connoisseur though I have snacked at the NoSQL corner store and stood in the Linux lunch line often enough to feel entitled to an opinion. You'll have to take it from there.

FWIW: 10gen.com, MongoDB's sponsor corp, has a real good 6 lesson course on-line for kinesthetic learners that will give you journeyman skills with MongoDB and get you started with node.js. Mostly hands on, so there is a lot of homework. And it's free.

Update April 22, 2015: While checking out the otherwise mediocre Node js Jump Start course on the Microsoft Virtual Academy I did come upon another resource that might be better suited to the visual learner: The Little MongoDB Book by Karl Seguin, available as a no-cost pdf – or in low-cost hardcopy at your favorite book store.

To otherwise help ease your introduction – if you decide to kick the MongoDB tires using node.js and you are a SQL DBA – I provide an example data migration below moving a SQL Server model into a MongoDB document that you can easily set up locally or modify to suit your data. Most of the work will be completing the download and installation of the few open source software libraries required.

For data here I use the simplified products table hierarchy from the AdventureWorksLT2012 sample database available from codeplex. product is easily recognizable as what I will call the unit of aggregation.

The unit of aggregation is all data that describes an atomic application object or entity. From the already abstracted relational perspective one could think of the unit of aggregation as everything about an entity de-normalized into one row in one table. In practice, many relational data models have already suffered this fate to one extent or another.      

In the AdventureWorksLT database I see three candidates for unit of aggregation: customers (4 tables), products (6 tables) and Sales (2 tables, parent-child – the child is probably the unit of aggregation). Product is interesting because there are nested arrays (1 to many relationships) and a grouping hierarchy (category). Here is a diagram of the SQL Server data:

[ProductsDbDiagram – database diagram of the SalesLT Product, ProductCategory, ProductModel, ProductModelProductDescription and ProductDescription tables]

This is loaded into a collection of JSON documents of products with the following JSON format. The value (right) side of each ‘name-value’ pair in the document indicates the table and column to be used from the SQL data.

[ 
  {
    _id : Product.ProductID,
    Name: Product.Name,
    ProductNumber: Product.ProductNumber,
    Color: Product.Color,
    StandardCost: Product.StandardCost,
    ListPrice: Product.ListPrice,
    Size: Product.Size,
    Weight: Product.Weight,
    SellStartDate: Product.SellStartDate,
    SellEndDate: Product.SellEndDate,
    DiscontinuedDate: Product.DiscontinuedDate,
    ThumbNailPhoto: Product.ThumbNailPhoto,
    ThumbNailPhotoFileName: Product.ThumbNailPhotoFileName,
    rowguid: Product.rowguid,	
    ModifiedDate: Product.ModifiedDate,
    category: 
      {
        ProductCategoryID: ProductCategory.ProductCategoryID,
        ParentProductCategoryID : ProductCategory.ParentProductCategoryID,
        Name: ProductCategory.Name,
        ModifiedDate: ProductCategory.ModifiedDate 	
      },
    model:
      {
        ProductModelID: ProductModel.ProductModelID,
        Name: ProductModel.Name,
        CatalogDescription: ProductModel.CatalogDescription ,
        ModifiedDate: ProductModel.ModifiedDate,
        descrs: 
          [
            {
              ProductDescriptionID: ProductModelProductDescription.ProductDescriptionID,
              Culture: ProductModelProductDescription.Culture,
              Description: ProductDescription.Description,
              ModifiedDate: ProductDescription.ModifiedDate 	
            }
            ,{more descrs in the square bracketed array}... 
          ]
      }
   }
   ,{more products - it's an array too}... 
 ] 

The code is executed from a command prompt with the /nodejs/ directory in the environment path. I am using node (0.10.25) on Windows Server 2012 with SQL Server 2012 SP1 Developer Edition at the default location and MongoDB 2.2.1 already installed prior to installing node. SQL Server is running as a service and mongod is running from a command prompt. I am using only Windows Authentication. For SQL Server access I am using the edge and edge-sql npm packages. edge asynchronously marshals T-SQL through the local .NET Framework libraries and returns JSON, but only works with Windows.

( April 3, 2017 update: have a look at the mssql package on NPM. Microsoft is the package owner and the output from the database is JSON by default. Works with Node.js on Linux, iOS and Windows )

edge-sql result sets come back to the javascript application as name-value pairs marshaled from a .NET ExpandoObject that looks and smells like JSON to me. The work left after the queries return results is merely to assemble the atomic data document from the pieces of relational contention and shove it into a MongoDB collection. This all works great for now, but I am not totally convinced that edge will make the final cut. I will also warn you that if you decide to start adapting the script to another table hierarchy you will be forced to also come to understand Closures and Scope in Javascript callbacks. I hope you do. It's good stuff. Not very SQLish though.

/*
  npm install mongodb
  npm install edge
  npm install edge-sql

  edge expects valid SQLClient connection string in environment variable
  before node is started.
  EDGE_SQL_CONNECTION_STRING=Data Source=localhost;Initial Catalog=AdventureWorksLT;Integrated Security=True

  edge-sql is a home run when aiming SQL data at a JSON target because
  you just supply valid T-SQL and edge will return the ADO recordset
  as a JSON collection of row objects to the scope of the query callback.   	

  edge-sql for a production system is a strike out (w/backwards K)
  1. returns only the first result requested no matter how
    many results produced
  2. the javascript file containing edge.func text is vulnerable to
    SQL injection hijack by adding a semicolon followed by any valid T-SQL
    command or statement provided first word in edge.func callback comment
    is insert, update, delete or select (not case sensitive)
  STEEEEERIKE 3. the connection is made with the security context of the
    Windows user running the script so database permissions and data can
    be hijacked	through an attack on the file with an edge.func('sql')
*/

var edge = require('edge');
var mongoClient = require('mongodb').MongoClient;

var mongoURI = 'mongodb://localhost:27017/test';

// named function expressions (compile time)
// you paste tested SQL queries in each functions comment block
// edge parses it back out and executes as async ADO
var sqlProductCursor = edge.func( 'sql', function () {/*
  SELECT  ProductID
        , ProductCategoryID
        , ProductModelID
   FROM SalesLT.Product;
*/});
var sqlProduct = edge.func( 'sql', function () {/*
  SELECT  ProductID AS _id
        , Name
        , ProductNumber
        , Color
        , StandardCost
        , ListPrice
        , Size
        , Weight
        , SellStartDate
        , SellEndDate
        , DiscontinuedDate
        , ThumbnailPhoto -- varbinary MAX!
        , ThumbnailPhotoFileName
        , rowguid
        , ModifiedDate
  FROM SalesLT.Product
  WHERE ProductID = @ProductID;
*/});
var sqlProductCategory =  edge.func( 'sql', function () {/*
  SELECT ProductCategoryID
       , ParentProductCategoryID
       , Name
  FROM SalesLT.ProductCategory
    WHERE ProductCategoryID = @ProductCategoryID;
*/} );
var sqlProductModel = edge.func( 'sql', function () {/*
  SELECT ProductModelID
        , Name
        , CatalogDescription
        , ModifiedDate
  FROM SalesLT.ProductModel
  WHERE ProductModelID = @ProductModelID;
*/});
var sqlProductModelProductDescription =
  edge.func( 'sql', function () {/*
    SELECT 	pmpd.ProductDescriptionID
          , pmpd.Culture
          , pd.Description
          , pd.ModifiedDate
    FROM SalesLT.ProductModelProductDescription AS pmpd
    LEFT JOIN SalesLT.ProductDescription AS pd
    ON pmpd.ProductDescriptionID = pd.ProductDescriptionID
    WHERE ProductModelID = @ProductModelID;
*/});		

mongoClient.connect( mongoURI, function( error, db ) {
  db.collection('products').drop();
});	 

mongoClient.connect( mongoURI, function( error, db ) {
  sqlProductCursor( null, function( error, sqlp ) {
    if ( error ) throw error;
    for (var i=0; i < sqlp.length; i++) {
      ( function (j) {
          sqlProduct (
            { "ProductID" : j.ProductID },
            function ( error, product ) {
              sqlProductCategory (
                { "ProductCategoryID" : j.ProductCategoryID },
                function ( error, category ) {
                  sqlProductModel (
                    { "ProductModelID" : j.ProductModelID },
                    function ( error, model ) {
                      sqlProductModelProductDescription (
                        {	"ProductModelID" : j.ProductModelID },
                        function ( error, descrs ) {
                          model[0].descrs = descrs;
                          product[0].category = category[0];
                          product[0].model = model[0];
                          db.collection('products').insert( product ,
                            function( error, inserted ) {
                              if (error) throw error;
                       		  });
                        });	// descrs
                    }); // model
                  }); // category
            }); // product
        })(sqlp[i]); // product closure
      }
    });
});	 

That’s all there is to it.
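And, per the April 3, 2017 update above, the edge/edge-sql plumbing could probably be swapped out for the mssql npm package today. A rough, untested sketch of a flattened version of the extract – assuming mssql's v4.x promise API, a SQL login rather than integrated security, and the same style of mongodb driver callbacks used above (the category/model/descrs nesting would follow the same closure pattern as the edge version):

var sql = require('mssql');
var mongoClient = require('mongodb').MongoClient;

var config = {
  user: 'sa',
  password: '<YourStrongPassw0rd>',
  server: 'localhost',
  database: 'AdventureWorksLT'
};

sql.connect(config)
  .then(function (pool) {
    // one round trip for the flat product attributes
    return pool.request().query(
      'SELECT ProductID AS _id, Name, ProductNumber, Color, ListPrice FROM SalesLT.Product');
  })
  .then(function (result) {
    mongoClient.connect('mongodb://localhost:27017/test', function (error, db) {
      if (error) throw error;
      db.collection('products').insert(result.recordset, function (error) {
        if (error) throw error;
        db.close();
        sql.close();
      });
    });
  })
  .catch(function (error) {
    console.error(error);
    sql.close();
  });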

Posted in NoSQL

Background Checks for EVERYBODY!

(last edit: Sept 29, 2018)

A background check simply filters and formats personal information about an eating, breathing person into a somewhat standard and therefore, presumably, useful “packet”. Much of the information in a background check is already out there in the public domain. Most of the rest is already controlled and/or owned by the government. What is missing is the true and just application of filters and formats – the so-called algorithms – needed to steadily organize said “packets” as in any way beneficial to humanity.  

Oh, there are algorithms in use. Primarily those developed and implemented by and for  marketers, but also governments and other well-funded cartels; though we are offered no transparency, and the accounts that do manage to “leak” to within the public’s earshot suggest any claimed good intention is at least dubious and too often, just another scene in an endlessly bad Keystone slapstick of corrupt, often evil, buffoonery.

And the data is clearly out there and being indexed by every Tomas, Dawud and Khariton on the planet now. Anyone that has never done so might be shocked to see what anyone can learn about anyone at the many web sites that openly traffic in other people’s background information. Much of the data is available simply by investing the time and absorbing the security breach it takes to do a search: no money required. Anyone willing to pay a low price can look even deeper into your background in even less time from anywhere on the planet, and all without your permission or knowledge. If you are OK with that, you might be a moron. To be sure, there are scads of shady web sites where anybody in the world can pay a byte of bitcoin to see a disturbing amount of information about you. That is to say: get enough of your Personally Identifiable Information (PII) to steal your identity without leaving a trace.

But the issue cannot be reduced much by simply going after these often legally acceptable but obviously checkered web sites. Data misuse – whether it has a web address or is clandestine – is endemic in crapitalism.

Consider the place(s) that sold and shipped thousands of rounds of ammo to a disturbed person that would later commit mass murder, even after that person had reached out to mental health professionals for help. I can only wonder if methodical determination within the health care system and, in parallel, the public safety agency(s) to keep this guy off the street as they attempted to stabilize his personality would have been enough to prevent the deaths of innocent movie-goers in Aurora Colorado. I’m betting it would if done by the book. Sadly, in the US anyway, the main purpose of our crapitalist run health care system is to make exec’s filthy stinking rich and secondarily to extract enough in every transaction to assure a return for shareholders and thereby persist the ruse of exploiting human need to satisfy human greed.

The public facing agencies of safety – I hesitate to call them law enforcement agencies because enforcement of laws just doesn’t happen much at this time – are occupying military forces in our cities, and clearly hold priorities based on personal and institutional biases tilted against the public’s interests, safety or security. Leadership at the department, precinct or station is almost entirely by those with low-level military officer backgrounds, and staff is increasingly drafted from among the same military veterans that as easily can find themselves homeless and abandoned by the government they served if they show signs of resistance to blind obedience. FWIW, the PTSD afflicted can as easily get a locker at the precinct as get one hot meal a day at the JESUS+SAVES. So the military becomes the filter for who has enough ice in their veins to get one of the high risk, low effort but OK paying cop jobs and who is cast aside. Military service destroys the young lives of the more sentient among the recruits and tries, albeit in a half-baked way, to cull them long before they can get on the street with the intention to use badge and bullet to do good in a community. The message: Doing good – dumb & bad, following orders – smart & good. The implication: you take care of us and we will at least pretend to take care of you. Let’s face it, the number of those on a government payroll with the authority to take lives regardless of citizenship – or those tasked with managing the government’s money – each grossly exceeds the number tasked with preserving or bettering the lives of citizens.

Then there are the US Government agencies that mostly stock the bench with the same flavor of minions – I’m thinking of the IRS, NSA, FBI, CIA and ATF – and, increasingly, the social-media/industrial surveillance engines, as well as the myriad of cookie powered marketing and transaction data crapitalists, not to mention the massive, and generally knowable to the stealthy, archives of banks, insurance companies, retailers and wholesalers that routinely accumulate giga-scads of data useful to check a person’s background.

Ever so slowly, the players are sharing select bits of this information to make a buck, but so far everyone still totally sucks at cooperating to render this information useful to humanity. They opt instead for crapitalistic self-interest (e.g., criminality, manipulation of the public, ever-increasing profits, economic rent, increased return on investment, bonus to self, cash flow, fear of reprisal, redress, revenge, litigation, bribes, etc.).

As I have previously blogged, Government Agencies hold mandates for unfettered, unquestioned and, as I remain convinced, un-American, unneeded and wholly ineffective access to ‘pen and tap’ data in every data center and central office in the country. What they cannot take for the asking or with a little sometimes heavy-handed coercion, they just take. And in the process the mandated pathways needed to get to the data are irrevocably thrown open to all comers.

New Government regulations intended to help actually make the situation worse. HIPAA, for example, defines standards for storing health information and compels health care providers to put your medical records in a supposedly secure but technically more share-able electronic form. Combine that with cryptography now widely suspected to have been intentionally made hack-able by government agencies in multiple countries. Downstream government agencies are surely licking their chops over this pool of easy data and working quietly behind the scenes to make sure that, whatever happens, the backdoor will be flung open for them on any whim. SSL security is all good and fine. It would be even better if it actually worked, though, and what we know is we are not getting that job done. So why isn’t there even a plan to be of genuine service to the communities this data is about? That’s all I want to know. Can there be no turning back toward truth from here?

Government data proper can often be classified as “public information”. Yet we the public have to know whom to ask or pay or seduce, and know the secret word and have the proper du jour political alignment, to see it. Even among and between government agencies there exist only ineffective mandates and poor compliance with weak models for effectively and mutually sharing data. Brilliant!

Stores routinely collect video surveillance and customer transaction histories. Each will process, archive, aggregate and perhaps share the collected data in what-ever ways have been ‘decided’ by the executive management.

Many 24-7 news operations exist at the local, national and international levels. We know they don’t simply overwrite yesterday’s news with today’s news. It accumulates and is indexed and/or cataloged by date and some variant of keyword. Has for all our lifetimes. If you made the news at any time in your life, it is in that archive and can be recalled by someone. Guaranteed.

It strikes me that all the data for good background checks is already out there. What is missing are the means of using this already collected, often already aggregated data so that the right decisions are made at the right time: the algorithm. Even worse, the impetus for the politicians that would have to drive the implementation remains too much about what not to tell the constituency. For them, the transparency targeted would be a threat to their elect-ability. It would force the merging of public persona and the real person behind the curtain. Thus, there is no prospect for a fair and just background checking algorithm in our crapitalist economy.

The already ubiquitous Artificial Intelligence (AI) and Business Intelligence (BI) tools in union with the data already being collected are enough to carry out useful algorithms to decide such things as who should or should not buy a gun – and help with implementation planning that doesn’t end in a shoot-out well before the first shot has been volleyed. All that is lacking is a collective will to get it done.

A background check must assure the greatest measure of accuracy, accountability and transparency possible from the body of information about a person that is already in the lawfully sharable domains. With transparency in background checks anyone and everyone would be able to find all background information about themselves, determine where that information first appeared in the background, and the history of all previous reads. The person could then work within the established framework(s) to correct any errors should they show up in their own background. A person would get a from-the-facts background check with each entry or update into the sharable data set.

The algorithm could also be applied on demand by authorized users or upon the request of the background’s subject. Furthermore, physical, emotional, and mental health warning signs could be followed up as thresholds of concern are approached, not as interventions of greater concern and far less hopeful prospects following calamities. We can also expect the fidelity of identifying warning signs to improve exponentially as the body of background data becomes integral.

Make no mistake, EVERYBODY’s PII data is out there in the wild now. The crapitalist tyranny is that the data, who uses it and how are lawfully kept from that person’s view and beyond their control. It can easily be used to persuade or deceive that person and as easily be manipulated – by someone who may or may not know anything about that person – to deceive others. This data must be openly available for view and stubbornly resistant to change or it will be uncontrollably compromised in unknowable ways like Facebook. PII that is open, as source code is open, is most secure by design.

Don’t Tread on Me

Still and all, imagine how anyone might react when they ‘fail’ a background check when trying to buy a gun. Especially if they might already suffer with mental health issues or are already wanted for a crime or are trying to add to their existing arsenal.

Point-of-sale background checks that actually establish if a person can buy guns could easily make a bad situation worse – at least some of the time. Particularly if doled out as a pass or fail ticket awarded to one citizen waiting in line but not the next, especially if the ‘fail’ meant “no gun today” and the denied person also brought an intention to do harm along with them to the gun store. This reveals an essential aspect of self-interested crapitalism: pitting a low-level front line wage earner as the enforcer against the customers with a wallet full of the power the crapitalist seeks. Couple that with the fact that most gun buyers in the US today are not buying their first gun: they are already armed. And don’t forget, an unknown number of guns trade hands at gun shows, in private and on black market exchanges.

The notion that for everyone to arm themselves is the best or only solution is pure crapitalistic marketing horse shit. When going after a problem where too many people are dying because too many people have guns, the most laughable solution is to push more people to get a gun. It is quite clear who would stand to benefit in that case though, isn’t it.

Imagine, from yet another perspective, how it might affect the vote if voters could query candidate background data only to find that a favorite politician had failed to disclose a long history of mental illness or abuse, or even which – if any – of the judges you now thoughtlessly approve at each election were blatantly corrupt or known prescription drug abusers or had sexually assaulted teen girls while blackout drunk as a young man and now would plow over anybody who said so rather than admit the truth.

Imagine if those responsible for hiring our teachers and police had similar information when evaluating teaching candidates or even the longest tenured educators. And imagine we had the same information about teachers and police as the people who hire teachers and police? And teachers and police had the same information about us – and those that hire them. And we had access to that information about our Doctor or car mechanic or date? Shouldn’t everyone be exposed to the same depth of scrutiny? NRA spokesperson Wayne LaPierre? Obama? Trump? You? Me?! It’s actually way too late to dicker over who should get such scrutiny. It’s happening now to everyone but not equally and without transparency or proper oversight. It’s also well protected by widespread denial among those with adequate power and record profits among crapitalists.

Everybody Gets a Second Chance

I do wonder what would become of those of us who don’t meet the background check sniff test at some time in our lives. After living all my life in this society where people prefer not to know their neighbors, that truth could be so disturbing that even greater horrors and mischief ensue. So many are now armed with assault weapons, and so few police have the training and skills required to recognize, let alone counsel, a person safely through a mental health crisis, that there probably is not a lot anyone can do about guns already in the wild for generations without something as drastic as a massive weaponized domestic drone campaign to ‘take away the guns’ from the cold dead droned hands of those labeled “should not have guns”. More realistically, mental health interventions will remain a point of difficulty that will require trained and skillful counselors, clergy and communicators. The good news is we could greatly lower acts of violence before and during interventions as we learn how to intervene sooner upon recognition of less onerous conditions and behaviors – or whatever are the right markers to be measuring in the algorithm.

It is beyond belief that the political system has even been discussing a plan that could so obviously end with a “take his guns away” order followed by desperation and, too often, a shoot-out – and even once is too often. Ordered revocation will not work any better than chasing the homeless out-of-town (again) or a prohibition on alcohol or tariffs on foreign commodities. Background checks only after a person has declared an intention to buy a gun from a lawful gun dealer are a half-way measure. Anybody seeking to avoid the check has other ways to buy the same weapons. It is a bozo mistake to tie background investigations to product sales.

And that seems to be where the conversation is now stuck in perpetuity. Some folks seem to believe that background checks are a waste of time and won’t help a thing. Others believe background checks are not enough and even more rules and regulations are necessary, convinced that the type of weapon foretells a propensity for misuse. Meanwhile the politicians provide us the predictable dis-services of misinformation, stonewalling and nonsense in the hope that nothing changes other than an ever increasing number of zeros in their bank balance – and maybe a little more stuffing in the easy chair. We need to collect and comprehend data to get this right. Beliefs may bring change, but it will always be hit or miss without the facts. We only get facts through the collection and deep comprehension of data.  

To get the conversation moving perhaps we have to also stop blaming any narrow slice of the population (e.g., gun buyers, disturbed people, terrorists, tree huggers, religious fundamentalists or politicians) as the source of this or any other systemic problem and, in so doing, stop erroneously seeing prevention as the elimination of certain stereotypical yet otherwise law-abiding types of persons from the population. As far as I can tell, human diversity – spiritual, emotional and mental – cannot be thwarted. By extension, criminals, haters and lunatics will be among each generation and will find ways to do their deeds whether or not they can legally buy a gun or have the proper training to use a gun.

Our goal must be a background checking algorithm that ameliorates the aberrant behaviors before they become an awful headline of death and disaster. Continuous cradle-to-grave background checks, coupled with an army of well-qualified trans-personal counselors reaching out regularly to help us understand what our background check is signaling, are essential. Like your body and your teeth, does not your mind deserve a regular checkup? Unfortunately, those in positions of power would place too much of their power at risk under such a structure, especially were it based upon transparency, simply because EVERYBODY has a few “red flags” in their background. And it doesn’t help much that money or power can so easily change the narrative in spite of the truth. We would probably need a brutal generation of holding politicians accountable to the standards of transparency and routine that need to be applied to everyone before they figure out how to be honest and upstanding human beings again.

Many polls suggest the American public is increasingly behind the need for better background checking of potential gun owners. Quinnipiac polling in June 2017 indicated 94% favor background checks. CBS News in June of 2016 had the number at 89%. Other polls in the last year fall as low as 84%.

Viewed circumspectly, it seems to me like everyone in the country is a potential gun owner and can easily circumvent ATF regulations just like always. Doesn’t that potential technically make EVERYONE a candidate for a pre-purchase background check? And since anyone can buy a gun at any time, isn’t it important to keep that background check up-to-date? It is entirely possible to buy a gun anytime, anywhere, so why on earth should we wait until someone is buying a gun to help them with health issues that could have been noticed, discussed and worked through far less obtrusively and more usefully long before that time? Is it really true that people bent on doing evil will find a way whether they have a gun or not? Should we be doing all we can to protect our communities by preventing those risks easily uncovered through background checks? Could be some mighty tough questions for politicians who are afraid of their own shadow.

All political systems of our time may already be too corrupt to give even lip service to a solution based upon transparency, accountability and human dignity. That is but one disadvantage of the bought-and-paid-for oligarchy we now suffer and of the financially driven, politically biased media that would so quickly lose ratings if accountability and transparency of background checks were the rule for everybody. Probably, politicians, bankers and corporate executives would be the biggest losers. A reasonable compromise among all or even most elected officials is not possible in this time when everything but the occasional Wikileaks dump happens behind closed doors. Background checks must be open and transparent to be beneficial to humanity. Backroom deals bargain away the bounty of all for the self-interests of the bungling politicians and their buddies, brothers and bankers. What exactly is wrong with standing before your peers and community as the person you are? Are we all that ashamed of the truth? Isn’t such self-deception the tether evil needs to grow its vine?

Background checks are already as much a cliché as death and taxes. The problem is that current background checks are sloppy, incomplete, inconsistent, usually clandestine and always susceptible to malevolence. I bet city cops, for example, get a much different pre-employment background check than seasonal city workers. But I am not at all convinced the cop gets the better or more informative check of the two. And we have two recent cases of Supreme Court Justice appointments where adequate background investigation of boatloads of credible accounts of sexual deviance was traded away, in the partisan political push for Senate advice and consent, for a burning car of clowns.

What we really need to do is less complicated:

1. agree on what needs to be in a background check (guaranteed, especially the first round or two, to be partisan burlesque the likes of which we have never seen, I can tell you that. And it will need to be revisited for revision soon and often.)

2. go about the business of compiling and checking backgrounds consistently using the agreed algorithm throughout the population.

Whatever data is needed, I have no doubt that the US Government already has more than enough access to personal information to begin this work. 99.99% of us would not need to lift a finger or change a thing. The Generals in charge of this access need nothing more than the rubber OK stamp they already have from the politicians to whom we have given voice. But instead of doing the work, we are witnessing a very different militarization of law enforcement in this country. A militarization apparently meant to entertain as it subjugates; to show the people they have nothing to fear as long as they do as they are told (and are white, as of 2016). A militarization that has not been instrumental in making us safer but showcases janitorial law enforcement, portrayed against a dramatic backdrop by clever camera operators and video editors supporting well-spoken narrators, all seeking to compose the most dangerous or scary looking scene among the many “news teams” recording the official “truth” while sequestered against the brick wall of some shop five safe blocks from the closest ‘action’.

Consider the Boston Bombing. Authorities failed in every way to secure the event or to treat backpack bombs scattered about the finish area as a real risk before the instant of attack. The level of threat was well known but not, apparently, a specific risk surfaced by any background check, nor was there any preparation or screening to thwart an attack. Too hard, I reckon – and that is my point exactly. To identify the bombers they had to rely upon video collected tediously, after the fact, from each local business that had already been compelled by self-interest to install surveillance cameras to thwart [lesser?] crimes of property theft. It took all of the next day for authorities to wade through this data and come up with a couple of freeze frames deemed ‘safe’ enough (i.e., absolutely no hint of law enforcement acting unlawfully and as few faces of innocents as could not easily be cropped out or obscured) to release to the public. Then authorities had to crowd-source the identities of those shown in the images on the evening news. But just enough picture to get names? All the while, authorities maintained a strangle-hold on the media covering the event.

This demonstrates once again – and without any doubt in my mind anyway – that the government has the technology and the data access to quickly and extensively check any person’s background once they have the person’s name and photo. Unfortunately the late and well orchestrated photo release backfired when the bombers saw on TV that their covers were blown and so thought it best to make a run for it. Duh. That could have been the end of it, but one of the two escaped after firing 200+ rounds and lobbing two bombs in a battle with police en route out of town. The authorities then continued with a heavily armed show of military force as they spent a long, hard, fruitless day searching door to door to door in fatigues and combat gear. The camouflaged, automatic-weapon-toting combat cops and armored vehicles in Boston were all over the TV in Colorado that day. And terror reigned all day. I was certainly hypnotized by and horrified at the media coverage.

Finally the authorities called off the search empty-handed at the end of the day and vacated a ‘shelter in place’ order for a million people in Boston – even though it was not known to be any more or less safe. But involving the people was once again that little flash of transparency the situation needed. A citizen found the bomber holed up in the back yard of the citizen’s home, well outside of the search area. Hours later the bomber was finally arrested – but only after government agencies had unloaded a couple of clips from police assault weapons into the hiding place of what turned out to be an unarmed and already badly wounded murderer. All the next day the spokespeople of the various authorities took turns telling the TV cameras what a great job the authorities had done.

Imagine if the two brothers that committed this evil, who were apparently radicalized after immigrating to the US, had entered the background algorithm’s data set at the same moment they entered the country. Could that cop-killing shootout have been avoided? Could the bombing have been put down before it began? Could the radicalization have been noticed while it was starting, and could an intervention – by someone with only the intention of helping these boys find other choices and understand what was being done to them – have been the only cost of forestalling the entire tragic sequence of rapidly unfolding events? Before Trump anyway, immigrants could get food stamps – and in some places, limited medical care – if eligible. But if isolated from the community, as is usual, they are simply forgotten as they struggle in a new land with a different language, as most will, and all bets on what happens to them after that are off. At least some of the people who reach out to them will be militant and radical, will have an agenda against the US and will have been trained – I use the term loosely – in how to befriend newbies in the mother tongue and convince them, using the familiar from the homeland, against some aspect of this new land: radicalization is begun.

Now that the event seems past, I can only wonder how much faster this tragedy could have been brought to a conclusion if the police were transparent and accountable instead of operating as splintered, secret military and paramilitary operations. Or whether they will subsequently do anything that might improve their chances of preventing bags full of bombs from being spread around the finish line of the next frivolous event that attracts a big crowd. Oh sure, they will flock to the next give-away of excess military equipment, but how does that in any way give them a new tool to stop the carnage of the next so-called sleeper-cell men who are left to their own devices and fall for the tropes of ideologues or their own madness?

I am certainly not trying to defend these bombers. Once they became so detached from humanity that they built the first pressure cooker bomb, I suspect they were already lost forever. I am saying that transparent background checks for everybody already show orders of magnitude more promise – based on those rare occasions when the approach is given a chance to work for only a day or two – than what we have now. And I’m saying that everyone already does get background checks. The change I’m talking about is an explicit, ongoing background check with complete transparency and full accountability for who performs each check and how. The government could at last stop pretending – in fact, would have to stop pretending – that they do not invade the privacy of citizens and other residents inappropriately.

Think of it! The development and recurring revenue potential of universal background checks for EVERYBODY alone may be among the most impactful governmental concessionaire’s opportunities of all time! Not that I advocate for a Big Brother state. I just think we ought to use the data already being collected and aggregated – and so already mandated by CALEA to be available for the pleasure and [mis]use of law enforcement – to also benefit those the data is about and to be available to others with a declared right to see that data. Who better than those already playing by CALEA rules to lead the charge? I want the access and knowledge already expended by the agencies, and the corporations they do business with, about my background to be extended also to me, and I want to be able to extend it to others I have determined have a good faith reason to peer into my background. And of course, I want to know who looks at my background. I am certain that the right way to ‘enforce’ background checks is primarily through transpersonal counseling, not janitorial police work. If it all happens before a crime is committed, and it can, why even involve the police? For that matter, why pretend that counseling a person to help them avoid trouble is in any way equal to criminal enforcement actions? If it is, we either aren’t using the right algorithm or aren’t applying the algorithm correctly.

That is as close as I will get here to an admission that the first years of checking everyone’s background could be bumpy. We are not that good at defining algorithms far less complex than this one would need to be. As so many technicians like to note, “It’s an iterative process.”

Maybe all we have to do is turn caring about others into a guaranteed revenue stream and crapitalism will magically protect us from homegrown terrorism? I admit though, there is $omething about that idea that does not fill me with hope but I can’t quite lay my hands on enough of it…

Seriously, the time to make contact with a person that has done something – or enough somethings – that should disqualify them from buying a gun or bullets or a bump stock for their AK or any other weapon part is the moment they have shown the behaviors mandating that they should no longer be allowed to do such things. Waiting until they intend to act and want to buy a gun to tell them they cannot buy the gun does not solve any problem before creating another, potentially even more contentious problem. And leaving a backdoor for them to buy a gun from a private seller or gun show without any background check requirement is at least inane, perhaps insane.

I would rather see law enforcement equipped with the truth as borne out by my background – and also required to live in the community as one of those they police before they are offered the job, btw – than the aloof militarized militias in pricey sunglasses we now see so coldly clubbing, gassing, shooting, zapping and spraying large throngs of unarmed, mostly non-violent, most often youthful protesters – disproportionately to the demographics, about half non-white – who gather in protest at the many blatant injustices of our time.

I also note that the cops have a mighty impressive kill rate for old men who are invariably described as “angry old men” and typically “barricaded” in their own homes, so I keep a low profile. (How often does that mean he shut the door and locked us out so we shot him?)

Perhaps I am frustrated to see these wastefully over-funded, highly secretive, inexplicably militarized agencies only ever able to arrive on the scene after the evil ones among us have acted with such terrible consequence to so many innocents. Seems like gross overkill for that sort of mop-up operation. A bulldozer at the back of the parade. But damn! Don’t those police troops in baggy camouflage outfits packing long rifles that could come a-knocking on your front door one day soon look so very badass from the right TV angle?

Posted in Privacy

Tails from a Diskless Hyper-V

The Amnesic Incognito Live System (Tails) is open source privacy software. Tails “helps you to use the Internet anonymously almost anywhere you go and on any computer but leave no trace…”. This post explores how a Windows private cloud or data center might leverage Tails to harden defense in depth.

Tails is a Debian Linux .iso rootkit – er I mean boot image – configured to enable peer-to-peer encryption of e-mail messages and IM messages plus The Onion Router’s (Tor) anonymous SSL web browsing. Tor’s add-ins set [java]scripting and cookies off for all web sites by default, although the user can elect to allow scripts or cookies on a per site basis. The recommended way to use Tails is to burn a verified download of the .iso on to a write-once DVD and then use that DVD as the boot device to start the computer. Tails is mostly designed and configured to leave no trace in that scenario and to assure that once the verified image is laid down on a DVD it cannot be changed. 

One limitation of this preferred scenario is that you need to reboot the machine a couple of times each time you use Tails. Once to boot and once to remove the footprint left behind in memory.

Another limitation is that a DVD drive may not be readily available or accessible when needed. Tails developers suggest an almost-as-secure USB alternative to the DVD, but caution that an ability to surreptitiously modify the kernel is introduced. Tails also allows the user to manually configure local storage, opening a potential security hole. Local storage is needed, for example, to load cryptographic keys for the secure OTR IM and PGP email messaging apps included for peer-to-peer privacy. Tails does automagically configure a piece of its memory as a RAM disk, allowing keys to be introduced without persistence, in theory.

Virtualization too, I propose, can remove the reboot overhead; however, the Tails documentation cautions against running Tails as a virtual machine (VM). “The main issue,” they say, “is if the host operating system is compromised with a software keylogger or other malware.” There simply is no facility for the VM to be sure no such spyware exists on the host. The usage I am suggesting below is the inverse of that trust model. Here we will use Tails to isolate the trusted host’s Windows domain from the Internet, leveraging virtualization to help preserve the integrity of the trusted node. From a practical standpoint, a better rule of thumb – though still in line with the cautious Tails statement on virtualization – may be to trust a virtual environment only to the extent you trust the underlying host environment(s) that support the virtual machine.

Nov 8, 2015 note – Unfortunately, the growing concerns that Tor is compromised are legitimate:

          https://invisibler.com/tor-compromised/

http://www.idigitaltimes.com/best-alternatives-tor-12-programs-use-nsa-hackers-compromised-tor-project-376976

Also, for virtualization other than Hyper-V see this information about Tails and virtual machine security at boum.org:

   https://tails.boum.org/doc/advanced_topics/virtualization/index.en.html 

A Windows Domain in a physically secured data center implies that the Domain and the data center Ops and admin staff are trusted. But when you open ports, especially 80/443, into that Domain, that trust is at increased risk. Given Hyper-V administrator rights on a Windows 2012 Server – but not while logged in with administrative system rights on the server – using Tails from a virtual machine might just be a safer, more secure and self-maintaining usability enhancement for a Windows-centric data center or private cloud.

  • Tails can eliminate many requirements that expose the Windows Domain to the Internet. Internet risks are sandboxed on the Linux VM. The Linux instance has no rights or access in the Domain. The Domain has no rights or access to the Linux instance other than via Hyper-V Manager. Most interestingly, Tails boots to a Virtual Machine that has no disk space allocated (other than the RAM disk already mentioned).
  • Tails will thwart most external traffic analysis efforts by competitors and adversaries. DPI sniffers and pen register access in the outside world will only expose the fact that you have traversed the Internet via SSL traffic to the Tor Network. SSL will prevent most snooping between the VM and the onion servers. No more than a handful of governments – and a few other cartels with adequate processing power – will even have the ability to backdoor through the Certificate Authority or brute force the SSL to actually see where you are going on the Internet.     
  • The Tails developers take care of the security updates and other maintenance. To upgrade or patch when used in the read-only diskless Hyper-V configuration, all you need do is download the latest image file.

Some organizations may be resistant to this idea because Tails will also allow employees to privately and anonymously communicate with the outside world while at work. True enough, the pen register broadcast by a Tor packet simply will not provide adequate packet-inspectable forensic surveillance detail to know what data center employees are up to. That alone could put the kibosh on Tails from a Diskless Hyper-V.

Organizational fear of employees notwithstanding, Tails in a Windows data center presents a robust security profile with excellent usability for those times when the knowledge available on the Internet is urgently needed to help solve a problem or understand a configuration. I would discourage efforts to configure a backdoor to monitor actual Tails usage from the host, simply because once that back door is opened anybody can walk through. Digital back doors swing both ways: better to put your monitoring energy into making sure there is no back door.

Tails is easy to deploy as a Hyper-V VM on Windows Server 2012 (or Windows 8 Pro with the Hyper-V client):

  • download and verify the file from https://tails.boum.org. No need to burn a DVD. Hyper-V will use the .iso file, although a DVD would work too if that is preferred and will undeniably help to assure the integrity of the image. A shared copy of the .iso can be used across an environment. It is necessary to ensure that the VM host computer’s management account and the user account attempting to start the VM have full access to the file share and/or file system folder of the image. (A quick checksum-verification sketch follows these setup steps.)
  • add a new Virtual Machine in Hyper-V Manager (a scripted equivalent follows the numbered steps):
  1. Give the VM 512MB of memory (dynamic works as well as static)
  2. Set the BIOS boot order to start with “CD”
  3. Set the .iso file – or physical DVD drive if that option is used – as the VM DVD Drive.
  4. Configure the VM with a virtual network adapter that can get to the Internet.
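
For repeatable builds, the same numbered steps can be scripted. This is only a minimal sketch, assuming Python on the host and the Hyper-V PowerShell module; the VM name, .iso path and switch name are placeholders for your environment:

import subprocess

# Placeholders - adjust to your environment.
VM_NAME = "TailsVM"
ISO_PATH = r"C:\ISO\tails.iso"
SWITCH = "External"   # a virtual switch that can reach the Internet

commands = [
    # 1. create a diskless VM with 512MB of memory
    f"New-VM -Name {VM_NAME} -MemoryStartupBytes 512MB -NoVHD",
    # 2. set the BIOS boot order to start with CD
    f"Set-VMBios -VMName {VM_NAME} -StartupOrder @('CD','IDE','LegacyNetworkAdapter','Floppy')",
    # 3. attach the verified .iso as the VM DVD drive
    f"Add-VMDvdDrive -VMName {VM_NAME} -Path '{ISO_PATH}'",
    # 4. connect the network adapter to an Internet-facing switch
    f"Connect-VMNetworkAdapter -VMName {VM_NAME} -SwitchName '{SWITCH}'",
    # uncomment if MAC spoofing is needed - see the May 5, 2014 note below
    # f"Set-VMNetworkAdapter -VMName {VM_NAME} -MacAddressSpoofing On",
]

for cmd in commands:
    subprocess.run(["powershell.exe", "-NoProfile", "-Command", cmd], check=True)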

May 5, 2014 note – I had to enable MAC spoofing in Hyper-V for the Internet Network Adapter when I used the newly released Tails version 1. The checkbox is located under Advanced Features of the Network Adapter of the VM. You will not find the Advanced Features option when accessing the Virtual Switch; it is a setting of the Network Adapter assigned to the Tails VM. I suppose another option would be to remove the MAC address hardwired into Tails’ “Auto eth0”, but that would also reduce your anonymity. It works this way, but that is all the testing I did on it! Use the hardwired MAC if possible.

  • Start the VM and specify a password for root when prompted. You will need to recreate the root password each time you start the VM in the diskless configuration. It can be a different password for each restart. Your call, though you should still use a strong password (e.g., suitably hardened to meet local password policies) and change it according to policy. The degree of protection of the local Domain from the Internet depends upon the security of this password. You never need to know that password again after you type it twice in succession, and you don’t want anyone else to know it either… ever.
  • Use the Internet privately and anonymously from your shiny new diskless Virtual Machine.
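
About the “verify” half of the first bullet: here is a minimal Python sketch that compares the SHA-256 of the downloaded image against the value published for that release. The path and the expected digest are placeholders; PowerShell’s Get-FileHash would do the same job on the host.

import hashlib

# Placeholders - substitute the real image location and the SHA-256 value
# published for that release on https://tails.boum.org.
ISO_PATH = r"\\fileserver\images\tails.iso"
EXPECTED_SHA256 = "0" * 64

digest = hashlib.sha256()
with open(ISO_PATH, "rb") as iso:
    # Read in 1 MiB chunks so the whole image never has to fit in memory.
    for chunk in iter(lambda: iso.read(1024 * 1024), b""):
        digest.update(chunk)

if digest.hexdigest() == EXPECTED_SHA256.lower():
    print("Checksum matches the published value; safe to attach to the VM.")
else:
    print("Checksum mismatch; do NOT boot this image.")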

Iceweasel can browse the Internet just fine in the diskless configuration. Using PGP or OTR, however, requires persisted certificates, and that requires disk storage. Instant Messenger and POP email using the tools in Tails won’t happen unless persistent certificates are available. There are probably a number of ways certificate availability can be realized, e.g., RAM disk, network, fob, etc.

A Hyper-V Administrator cannot be prevented from configuring storage inside the Virtual Machine if storage is available to the Hyper-V. Hint: A Hyper-V Administrator can be prevented from configuring storage inside the Virtual Machine if no storage is available to the Hyper-V Administrator.

Not a total solution, but it gives a very clean ability to jump on the Internet when needed without exposing the domain to the Internet.

Posted in Privacy, Secure Data

For Privacy Open the Source & Close the Back Door

There is no surprise in the many recent corporate self-admissions that they, too, have given up our private information. After all, they got us to release our privacy to their care with barely a flick and a click. As a direct consequence – and without need of oversight through lawful warrant or subpoena – Internet service providers (ISPs) and telecommunications service providers are compelled to release our pen registers, profiles, email and stored files to host location authorities (e.g., local, state and federal agencies) everywhere in the world when requested. The corporations can, will and have freely, willingly and routinely provided our private data, stored on their servers or clouds, upon request. And they will decipher our encrypted private data to assist such surveillance if they can. It is all done with our expressed permission.

A 2012 study I read about in The Atlantic estimates that we each would have to spend about 200 hours a year (that is 78 work days, with a calculated cost of $781 Billion to GDP) to actually read all the privacy policies we have accepted. At the same time, the word count in privacy policies is going up, further reducing the likelihood that they will be read and understood. In my opinion, the purposeful design of privacy policies – to make them easiest to accept without reading – demonstrates the Internet’s ability to coerce the user into acceptance.

“It’s OK, I have nothing to hide,” you might be thinking. And to that, “It won’t hurt a thing,” is often added to the same fallacious rationalizations. That sort of thinking is continuously exposed for what it is by the stinky announcements that gigantic globs of our personally identifiable information (PII) stored on corporate servers have been leaked to the bad guys through massive and mysterious spigots lurking in some company’s data. The leaks signal the reminder that government mandated surveillance back doors in the data center (DC) and central office (CO) architectures help provide the weakened security upon which Internet hackers rely.

Thanks to the server back doors, criminals and marketers enjoy the same back door transparency without accountability as do government agents or anyone else who somehow has access through the back door. Truth be told, marketers have better back door access than government agencies in many cases. This is generally the case when you deal with any free service or web site that boasts it “does not save your data”. What it usually does is mine your data as it comes through and distribute some part directly to a third, fourth, fifth, etc. party for harvest. Unauthorized outsiders and criminals often rely upon masquerading as an administrator, marketer or possibly a government agent at the back door.

So it is.

Back doors of any stripe undermine security. Exploiting server back doors is a common objective of marketers, sellers, executives, governments, employees, hackers, crackers, spies, cheats, crooks and criminals alike. The attraction is that there is no way for you to tell who is standing at the back door or who has ever accessed your PII data at the server. While intrusion detection and logging practices have improved over time, their uptake of state-of-the-art technologies lags. At the same time, the talents of intruders have not only kept pace with but often are defining the state of the art.

Computing back-doors are not a new phenomenon. We could by now be raising our children to fear root kits as if by instinct. Root kits are just back door knobs.

Cookies? Trojans? Worms? Other so-called malware – especially when the malware can somehow communicate with the outside world. It all fits out the back door. SQL Injection? Cross-site scripting? Man-in-the-middle attacks? Key-loggers? Just back doorways.

I need to take it one step further though. To a place where developers and administrators begin to get uncomfortable. Scripting languages (PowerShell, c-shell, CL, T-SQL, VBA, javascript, and on and on and on) combined with elevated administrative authority? All free swinging back doors.

That’s right! Today’s central offices, data centers and, by extension, cloud storage services are severely and intentionally weakened at their very foundation by mandated back doors that have been tightly coupled to the infrastructure for dubious reasons of exploitation. That’s nuts!

What’s worse? We the people – as consumers and citizens – pay the costs to maintain the very electronic back doors that allow all comers to effortlessly rob us of our earnings, identities and privacy. What suckers!

And we provide the most generous financial rewards in society to the executives – and their politicians – that champion the continuation of senselessly high risk configurations that burp out our private information to all comers. That’s dumb.

~~~~~

So, how did we get here? It started way before the PATRIOT Act or September 11, 2001. The process has served to advantage governments and – in exchange for cooperation – businesses, with little transparent deliberation and much political bi-partisanship. Both corporate and political access without accountability to user PII has been serviced at the switch in Signaling System 7 for as long as there have been such switches and at the server for as long as there have been servers.

To wit: Mssr. A. G. Bell and Dr. Watson – I presume – incorporated AT&T in 1885.

To implicate contemporary corporate data stewards, all one need do is look at the explosion in so-called “business intelligence” spending to see user data in use in ways that do not serve the interests of, or in any other way benefit, the users. Most often the purpose is to aid others to make more money. I leave it to you to decide how others might profit from your data.

Some act without any degree of ethical mooring. There is a driven interest, by most corporations that can afford the up-front infrastructure costs, to use all the data at their disposal in every way imaginable in the quest to lift the bottom line. It is done regardless of whether people are harmed – a virtue of capitalism. The only thing that matters to a Corporation is profit. I mean, who would ever sell cigarettes using advertising filled with sexy beach scenes and handsomely rugged cowboys but knowingly forget to mention that smoking cigarettes is one of the worst things you could ever do to yourself? This intention to mine your data for behavioral advertising purposes is one of the topics you could have read a few words about, deep under that “I have read” button you magically uncovered and thoughtlessly clicked through when presented the chance to read those pesky privacy policies first. Too late now…

The legislation and adjudication in opposition to government mandated communication back doors in the US can be followed back to the bootleggers during Prohibition. In 1928 the Taft Supreme Court (Coolidge was the President) decided (5-4) that obtaining evidence for the apprehension and prosecution of suspects by tapping a telephone is not a violation of a suspect’s rights under the 4th or 5th Amendments to the US Constitution.

The Communications Act of 1934 (Roosevelt) granted oversight of consumer privacy to the newly created Federal Communications Commission (FCC).

Beginning in the 1960’s, with no real concern evident among the people, weekly television broadcasts revealed how Opie’s Pop, Sheriff Andy, could listen in on your phone calls or find out who you had talked with and what you had said in past phone conversations. All he had to do was ask Sarah at the phone company.

Alas, in 1967 the Warren Supreme Court (Johnson) overruled the 1928 decision (7-1) and said the 4th Amendment does in fact entitle the individual to a “reasonable expectation of privacy.” This was widely thought to mean government agents had to obtain a search warrant before listening in on a phone conversation. However, the erosion of privacy at the confluence of surveillance and profit has since become a muddy delta.

All privacy protection during “any transfer of signs, signals, writing, images, sounds, data, or intelligence of any nature transmitted in whole or in part by a wire, radio, electromagnetic, photo-electronic or photo-optical system that affects interstate or foreign commerce” was revoked in the US – in a bi-partisan fashion – by the Electronic Communications Privacy Act (ECPA) of 1986 (Reagan). ECPA effectively expanded the reach of the Foreign Intelligence Surveillance Act (FISA) of 1978 (Carter) to include US Citizens: heretofore protected by the Bill of Rights from being spied upon by the US government.

No one I know had an email address in 1986. So no one cared that ECPA stripped American citizens of their email privacy. No one I know does not have an email address in 2013 (Update April 1, 2017: free email is now on life support and about to die – an abortion would have been so much better for everyone). Still, few seem alarmed that there has been no electronic privacy in the US since 1986. Judging by the popularity of the Internet-as-it-is and in the light of the unrelenting and truly awful stories of hacking resulting in travesties from identity theft to stalking to subversion of democracy coming to the fore every day, perhaps nobody even cares?

But it continues to get worse for you and me. With the Communications Assistance for Law Enforcement Act (CALEA) of 1994 (Clinton), the full burden of the costs to provision and maintain an expanded ECPA surveillance capability was thrust upon the service provider. I leave it, again, to you to decide how service providers funded the levy (hint: profits are up). Beginning explicitly with CALEA, providers are now required to build data centers – and Signaling System 7 COs, cellular networks, SMS, etc. – with a guaranteed and user-friendly listening ability for surveillance agents working under ECPA authority: the free-swinging back door became a government mandate.

The Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act (USA PATRIOT ACT) of 2001 (Bush 2) removed any need to notify an individual that they had been under surveillance until and unless authorities arrest and charge that individual. The burden of electronic privacy was placed squarely on the individual. Privacy officially died. Not that things really changed all that much.

Even now agencies play games under the cover of the USA PATRIOT ACT by charging US non-citizens and holding and torturing them as desired in indefinite detention in offshore facilities, perhaps in part to avoid having to disclose methods should some matter ever come to trial. I have no way to know exactly what they are doing, but the pattern of escalating surveillance permissiveness in legislation, combined with the steady leaking of heinous truths over time, suggests that it is only a matter of time before the ability to hold citizens without charge becomes an effective sledgehammer methodology for agencies and, then, the local police. History is quite clear that such detainment will be used and will be used inappropriately.

Still the politicians remained unsatisfied. In 2008, FISA was amended to effectively eliminate the distinction in agency surveillance between an enemy combatant and a citizen. Now, indeed, everyone, citizen and non-citizen alike, is ‘the enemy’ through the FISA visor. FISA Amendment changes continue to ripple through ECPA, CALEA and USA PATRIOT ACT regulations, expanding the authority of a force already claimed by its own bureaucratic leadership to be stretched too thin to keep track of what it is doing. That expansion has been accompanied by a decrease in already inadequate and faltering judicial oversight, now made less transparent and less accountable than is necessary for an effective and democratic “rule of law”.

In 2006, and then again in 2011, the USA PATRIOT ACT regulations that were supposed to expire – because they were supposed to make the country safe enough, in a limited time, not to be needed in the future – were extended… and re-extended.

Recently the NSA claimed it would violate our privacy if it secretly told even the two US Senators authorized for NSA oversight approximately how many citizens it had electronically spied on. Why is that not disturbing to most? It is worth noting that the Generals of the NSA – yes, the US Military calls the shots on the privacy of all American Citizens – made it clear at that time that perhaps no one has a way to tell who has and has not been electronically spied upon, as an alternative way to explain why they could not answer the Senators’ question.

It might be OK if privacy had been fairly traded for security, but that has not happened. Instead, the government has given our privacy to these unaccountable agencies and the terrorism continues. The police and other agencies are arriving only in time to clean up the mess, spending shitloads of the public’s money putting on a good show for the cameras, and spinning the truth about how much these laws and this ‘enforcement’ are helping. They may be getting better at stopping the second terror attack of a particular stripe, but that is only valuable to society when the bad guys repeat a type of attack. So far, that is not happening. The agencies are being pwned big time and don’t even notice because they are too busy reading our email.

The 4th Amendment is, for all intents and purposes, null and void – unless you have a bigger gun. The 9th Amendment is now about exceptions and exclusions to rights instead of the protection of rights not named elsewhere in the Bill of Rights, as the unchanged text of the amendment would suggest. If I understand correctly, even the 1st Amendment has been struck. I’m not a constitutional expert, but I am 100% positive privacy is out the window like a baby toy and we are now too far down that road to even think about going back to find it.

Our government is now self-empowered to spy on the people, self-evidently convinced it must spy on the people and self-authorized to exterminate its own citizens without the process of law we are told is due every citizen. This is territory most inconsistent with the Constitution of the United States as I understand it and wholly unacceptable to the vast majority of the citizenry with knowledge of the matter as far as I can tell. Indeed, what people on earth should tolerate such governance?

 

Update August 22, 2015. The USA FREEDOM Act of 2015 (Obama) stirs the muddy waters of privacy but in the end is little more than a re-branding effort that hopes to squelch the post-Snowden outcry against mass surveillance. And with the passing of the FREEDOM Act, the push against cryptography by the agencies has been redoubled.

~~~~~   

So, what can be done? Here are some guiding principles for anyone seeking to take back their online privacy. It ain’t pretty:

  1. There is no plan to delete anything. Never write, type, post or say anything [on-line] you do not want others to see, read, overhear or attribute to you. Anything you put on the Internet just may be out there forever. IBM has boasted the ability to store a bit of data in 12 atoms. Quantum data storage is just around the corner. MIT suggests that Quantum computing (@ 2 bits per atom) will be in Best Buy by 2020. And search technology is making orders of magnitude larger strides than storage technology.
  2. You cannot take anything back. Accept that all the information you may have placed online at any time – and all so-called ‘pen registers’ that document your interactions during placement – does not belong to you. Sadly, you may never know the extent of compromise this not-yours-but-about-you data represents until it is too late to matter. The single most important action you can take to safeguard what little is left of your privacy – from this moment forward – is to use only peer-reviewed, open source, privacy-enabled software when connected to the Internet and to deal only with those who respect your privacy. But where are those capitalists?
  3. Stop using social web sites. There are many ways to keep track of other peoples’ birthdays. There is not much worth saying that can be properly said in one sentence or phrase and understood by boodles of others. It makes for good circus and gives people something to do when there is nothing appealing on TV, but it is not good for communication or privacy. But combine the keywords from your clucks, the demographics from your birthday reminder list and your browsing history, and it is far more likely that you can be effectively ‘advertised’ into a purchase you had not planned or researched the way you likely claim you always do. Such behavior-inducing advertising, in essence, cheapens life while it makes a few people a lot of money.
  4. Avoid web sites that know who you are. Search engines and portals, like all free-to-use web sites, get their money either through donations and fundraising or else through the back door by keeping and reselling your history. Maybe forever? This data is not generally encrypted, nor even considered your data (oops, there goes that pesky Privacy Policy again). Nonetheless, anyone who can hack into this not-your-data has the information needed to recreate your search history and, in all likelihood, to identify you if so desired. Corporate data aggregations and archives – so-called data warehouses – often leave related data available to business analysts, developers, network engineers, and any sneaks who might find a way to impersonate those behind-the-scenes insiders, through a nicely prepared user interface that can drill down from the highest aggregations (e.g., annual corporate sales or population census data) to the actions and details of an individual in a few clicks. Once ordered, organized, indexed and massaged by high powered computers, this data remains ready for quick searching and available in perpetuity. Protect your browsing history and searches from as much analysis as possible – a favorite pen-register-class surveillance freebie for governments (foreign and domestic), marketers, and criminals alike. One slightly brutal way might be to surf only from freely accessible public terminals and never sign in to an online account while surfing from that terminal. An easier, open source, but still more-work-than-not-caring way may be to hit Tor’s onion servers using Firefox and Orbot from your Android device, or the Tor browser bundle from your Linux desktop or thumbdrive. (We have no way to know if the Windows or Mac desktops are backdoored.) You could even combine the two approaches with Tails – assuming you can even find a public kiosk or Internet Cafe that will let you boot to Tails. A VPN from home would work well too, if you can be certain the VPN provider holds your interest and privacy above more profits.
  5. Use only open source software that you trust. Avoid all computer use, especially when connected to the Internet, while logged in with administrator or root authority. Particularly avoid connections to the Internet while logged in with administrator or root credentials. Avoid software that requires a rooted smartphone or a local administrator login during use.
  6. adopt peer-to-peer public key cryptography (a minimal sketch of the idea follows this list)
    1. securely and safely exchange public keys in order to have confidence in the integrity of the privacy envelope of your communications and exchanges with others.
    2. exchange only p2p encrypted emails. Never store your messages, even if encrypted by you, on a mail server, else you forgo your right to privacy by default. I think US law actually says something like: when your email is stored on somebody else’s mail server, it belongs to that somebody else, not to you. Even Outlook would be better, but Thunderbird with the Enigmail OpenPGP add-in is a proven option for PGP encryption using any POP account. The hard part will be re-learning to take responsibility for your own email after becoming accustomed to unlimited public storage (and unfettered back door access). It will also become your responsibility to educate your friends and family about the risks, to convince them to use peer-to-peer public key cryptography and secure behaviors too. Until then your private communications to those people will continue to leak out no matter what you do to protect yourself.
    3. exchange only p2p encrypted messages. For SMS text, investigate TextSecure from Open WhisperSystems. I don’t have a suggestion for SMS on the desktop. For other messaging, check out GibberBot, which connects you through the Tor network on your Android device. If used by all parties to the chat, this approach will obfuscate some of your pen registers at the DC and all of your message text. Installing Jitsi adds peer-to-peer cryptography to most popular desktop Instant Messaging clients. Jitsi does not close any back doors or other vulnerabilities in IM software. Your pen registers will still be available at the server and attributable to you, but your private information will only be exposed as encrypted gibberish. Using the onion servers with Jitsi or GibberBot will help obfuscate your machine-specific metadata, but the IM server will still know it is your account sending the message. Security experts seem convinced that Apple’s loudly advertised iMessage has a back door: http://blog.cryptographyengineering.com/2012/08/dear-apple-please-set-imessage-free.html
    4. exchange p2p encrypted files. If you get the key exchange in 1. right, this will be a breeze.
    5. exchange p2p encrypted SMS messages, or else avoid SMS. I had briefly used TextSecure from Open WhisperSystems on Android 4.x. I don’t have a secure, tested Windows or Linux desktop suggestion for SMS.
    6. exchange p2p encrypted voice communications. Web phone Session Initiation Protocol (SIP) providers are subject to the same pen and tap logging rules as all other phone technologies. The biggest practical differences between SIP and good old System 7 or cellular switching are the open source software availability and the throughput potential. With SIP, several open source apps are available now, built upon Zimmermann’s Real-time Transport Protocol (ZRTP) for peer-to-peer encryption of SIP-to-SIP multimedia-capable conversations. I know Jitsi includes ZRTP by default for all SIP accounts registered. When a call is connected the call is encrypted, but ONLY if the other party to the call is also using a ZRTP peer.
  7. avoid trackers, web bugs and beacon cookies. Cookies are tiny files, an invaluable enhancement for user experience, that don’t get wiped from your machine when you leave the page that dropped the cookie. Cookies have become impossible to manage manually because there are so many and because many cookie bakers try to make it difficult for you to determine the ingredients of the cookies on your machine that fill with your data. That is so creepy. What could go wrong? But tracker cookies are worse than most. They keep collecting your data even after you leave the baker’s web site and disconnect from the Internet. Then, every chance they get when you next connect to the Internet, these cookies will gather more data from you and ever so slyly transmit your data out to their death star. Lots of trackers come in with, or as, advertisements, though most are simply invisible to the unaware user. One classic beacon cookie is a picture file with no image, just a tracking data collector, yet done in a way that convinces most (all?) tracker detectors of its innocence. However, it would be foolish to characterize trackers as ever built one way or another. The design goal is and will always be to not look like a tracker. In today’s world, I believe it safe to say that the tracker builders continue to have an easy time of it and are one giant step ahead of the tracker trackers. I have AdBlock Plus, Ghostery, Disconnect and the EFF’s Privacy Badger running on the desktop browser I am using to edit this old page. AdBlock Plus finds 4 ads, but blocking has to be off or I am not able to edit the blog post. Ghostery finds 3 web bugs and Privacy Badger identifies 7 trackers for this page. Disconnect finds 27 requests for my data from a variety of sources. Privacy Badger sees 16 potential trackers and blocks all but three. The thing is, Disconnect has 19 requests sorted under a ‘content’ category that are not blocked, and if I try blocking any of them, the free WordPress weblog breaks. Google and Twitter both seem to have trackers on me that neither Privacy Badger nor Disconnect blocks by default on my browser. Could be I made the wrong choice on some Google policy 11 years ago, could be I was drunk the other day and clicked on accept so I could get to the porn faster, or could be Google or WordPress imposes this unblocked condition. Could even be that they are benign cookies and my tracker trackers know it, though I mostly doubt any scenario other than that they seek to send my data back to the server.
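
To make item 6 a little more concrete, here is a minimal Python sketch of the core public key idea using the third-party cryptography package. The message and key size are illustrative choices only; real tools such as PGP/Enigmail or ZRTP wrap this exchange in hybrid encryption, signatures and key management.

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Each peer generates a key pair and publishes only the public half.
my_private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
my_public_pem = my_private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

# A correspondent who has safely obtained the public key encrypts to it...
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
peer_copy_of_public_key = serialization.load_pem_public_key(my_public_pem)
ciphertext = peer_copy_of_public_key.encrypt(b"see you at the co-op at 7", oaep)

# ...and only the holder of the matching private key can read it.
plaintext = my_private_key.decrypt(ciphertext, oaep)
assert plaintext == b"see you at the co-op at 7"

The thing to notice is that only the public halves ever travel between peers; whoever never releases the private half is the only one who can read what was encrypted to it.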

As you can see, effective tools are not really available to protect on-line privacy short of end-to-end encryption. What’s more, the bad guys are already using all the tools to keep themselves undetected! The challenge for us is a human behavioral issue that ultimately demands little more than awareness of what is happening around you and a willingness to cooperate in a community of others in search of privacy. Could be that cooperation alone is the overpowering impediment in these polarized times. Oddly, most find it easier to trust Google and Facebook than to trust the people they know. Only if everyone in the communication values privacy and respects one another enough to move together to a peer-to-peer public key cryptography model, using widely accepted and continuously peer-reviewed software, can anyone hope to find satisfactory digital privacy.

We must start somewhere.

I repeat, the bad guys made the changes long ago so your resistance serves only your demise and the ability of others to profit from your data until that time.

Sadly, I’m not at all sure how to convince anyone that spends time on Facebook, Twitter and that lot not to flock toward the loudest bell. You all are throwing your privacy to the wolves. With each catastrophe perpetrated by the very bad guys that the rape of our privacy was supposed to protect us from, the media immediately and loudly lauds the police and other agencies for doing such a great job and proclaims how lucky we are to have them freely spying upon our most personal matters. The agencies, for their part, continue to bungle real case after real case yet maintain crafty bureaucratic spokespeople to pluck a verbal victory from the hind flanks of each shameful defeat of our privacy. Turns out the agencies don’t even use the pen registers and tap access for the surveillance they claim to be crucial. Instead it is a helper when sweeping up the mess left behind by the last bad guy that duped them. Why are the agencies not preventing the horrible events as was falsely promised during the effort to legitimize their attacks on our personal privacy?

For genuine privacy, all people in the conversation must confidently and competently employ compatible p2p cryptography. That’s all there is to it. Until folks once again discover the fundamental value in private communications and public official transparency, public accountability is beyond reach… and your privacy and my privacy will remain dangerously vulnerable.

Posted in Code Review, Privacy, Secure Data

It’s [Still!] the SQL Injection… Stupid

Did you see Imperva’s October 2012 Hacker Intelligence Report? The report is a data mining study directed toward the on-line forum behaviors of a purportedly representative group of hackers. The milestone for October 2012 is that Imperva now has a year’s worth of data upon which to report: almost half a million threads in the mine. In this study, keyword analysis found SQL injection sharing the top of the heap with DDoS in terms of what hackers talk about in the forums. In the wake of PowerShell 3.0, the study also identifies shell code as the big up-and-comer for mentions in the hacker threads. Only 11% of the forum posts mention “brute force”, “brute force” being the only topical category Imperva charted with a direct relationship to cryptography.

The absence of an aggregate specifically dedicated to cryptography or encryption strongly suggests the hackers are not talking much about cryptography. Hmmmm.

Keyword frequency in 439,587 forum threads:

  1. SQL injection 19%
  2. DDoS 19%
  3. shell code 15%
  4. spam 14%
  5. XSS 12%
  6. brute force 11%
  7. HTML injection 9%

The report also cites disturbing data from a 2012 Gartner study, “Worldwide Spending on Security by Technology Segment, Country and Region, 2010-2016”. The Gartner study purportedly finds that less than 5% of money spent on computing security products buys products that are useful against SQL injection.

Holy octopus slippers! The statistics definitely make a good case for taking a look at Imperva’s products. Even if there is some marketing bias in the study – I don’t think there is, but even if there is – the findings are more bad news for data security. What’s worse is we seem to be headed in the wrong direction. Consider:

  • The SQL Server 2008 Books Online had a page of SQL injection prevention best practices that was removed from the SQL Server 2012 edition.
  • SQL injection prevention guidance has been largely unchanged for many years yet is not widely followed. Input validation is the key. Judging by the hacker interest in SQL injection, adoption of the guidance must be low.
  • Hackmageddon.com’s Cyber Attack Statistics indicate that SQL injection is found in over 25% of the hacker attacks documented during October 2012.
  • Cloud host FireHost – a VMware-based web host with an impressive security claim and data centers in Phoenix, Dallas, London and Amsterdam – reported that attacks its infrastructure had detected and defended against showed a 69% spike in SQL injection in the second quarter of 2012 (and then cross-site scripting surged in the just-ended Q3).

What is stopping us from truly going after this problem? Input validation is not hard and need not be a bottleneck (a short sketch follows below). Threat detection is supposedly a part of any ACE/PCI/FIPS/HIPAA compliant system. Detection avoidance is well understood. Nonetheless, and for reason$ far beyond reasonable, the strategic emphasis remains stuck on compliance. That would be OK if the standards of compliance were even close to adequate. Clearly they are not. The proof is in the pudding.
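
The heart of that unchanged guidance is to never splice untrusted text into a SQL statement. Here is a minimal Python sketch of the difference; it uses the standard library’s sqlite3 module purely for illustration, and the same parameter-binding idea applies to SQL Server through a parameterized client call or sp_executesql.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

user_input = "nobody' OR '1'='1"  # a classic injection attempt

# Vulnerable: the input is spliced into the statement and parsed as SQL,
# so the OR '1'='1' clause returns every row.
leaky = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'").fetchall()

# Parameterized: the input travels as a value, never as SQL text,
# so the injection attempt matches nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()

print(leaky)  # [('alice',), ('bob',)]
print(safe)   # []

Applied consistently everywhere dynamic SQL gets built, that one-line difference removes the very attack the hackers talk about most.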

There are SQL injection detection and avoidance products out there that work. Many, many products. Just to name a few – and by way of describing the types of tools that forward-thinking organizations are already buying in an effort to eradicate SQL injection:

Application Delivery Network (ADN)

  • Citrix NetScaler
  • F5’s BIG-IP
  • Fortinet FortiGate

Web Application Firewall (WAF)

  • Applicature dotDefender WAF
  • Cisco ACE Web Application Firewall
  • Imperva Web Application Firewall & Cloud WAF
  • Barracuda Networks Web Application Firewall
  • armorize’s SmartWAF (Web server host based)

Web Vulnerability Scanners (WVS)

  • sqlmap (free)
  • Acunetix WVS

Unified Threat Management (UTM)

  • Checkpoint UTM-1 Total Security Appliance
  • Sophos UTM Web Server Protection
  • Watchguard XTM

These products and the overarching practice changes needed to implement them show success in going after the problem. But, as the Gartner study shows, nobody seems to be buying it.

There are also cloudy platform hosts and ISPs like FireHost that handle the WAF for organizations that cannot justify the capital costs and FTEs required to do the job right in-house due to scale.

Ever so slowly the major hosts are imposing WAF upon all tenants. Another n years at the current snail’s pace and the security improvement just might be noticeable. Seems to me like “do it and do it now” is the only choice that can reverse a ridiculous situation that has gone on too long already. Even secure hosts prioritize profitability over basic security. That is rent seeking.

Any web presence that takes another tack is telegraphing priorities that violate the duty to protect that which is borrowed under an explicit promise of confidentiality, or under generally accepted fiduciary performance levels equivalent to all other financial and intellectual assets of that organization. Few meet the sniff test. Many remain highly profitable. Just take a look in your Facebook mirror. Customers and consumers have no visibility into how any organization will honor this responsibility, nor recourse when that duty is shirked. The metrics above make clear that poor security practices shirk the responsibility and carelessly feed the identity theft racket. It must end. Organizations that perpetuate this status quo and remain profitable are complicit in an ongoing crime against humanity. IT staff who are fairly characterized as “team players” or leaders in such organizations are every bit as culpable as the soldiers of Auschwitz or My Lai or the owner of a piano with ivory keys.

Organizations private and public have a fundamental obligation to protect customers, clients and citizens from illegal exploitation. What in the world makes it OK to exclude chronic identity theft violations from that responsibility?

Even when the data center budget includes one of the more robust solutions, having done the needful in terms of basic input validation, code/user authentication and principle-of-least-privilege access rights is essential for any good defense-in-depth security strategy.

Consider this T-SQL function from the Encryption Hierarchy Administration schema that implements a passphrase hardness test based upon the SQL injection prevention guidance in the SQL 2008 BOL.

1   -------------------------------------------------------------------------------
2   --    bwunder at yahoo dot com
3   --    Desc: password/passphrase gauntlet
4   --    phrases are frequently used in dynamic SQL so SQL Injection is risk
5   -------------------------------------------------------------------------------
6   CREATE FUNCTION $(EHA_SCHEMA).CheckPhrase 
7     ( @tvp AS NAMEVALUETYPE READONLY )
8   RETURNS @metatvp TABLE 
9     ( Status NVARCHAR (36)
10    , Signature VARBINARY (128) )
11  $(WITH_OPTIONS)
12  AS
13  BEGIN
14    DECLARE @Status NVARCHAR (36)
15          , @Name NVARCHAR(448)
16          , @UpValue NVARCHAR (128) 
17          , @Value NVARCHAR (128) ;
18    -- dft password policy as described in 2008R2 BOL + SQL Injection black list
19    -- fyi: SELECT CAST(NEWID() AS VARCHAR(128)) returns a valid password 
20    SET @Status = 'authenticity';
21    IF EXISTS ( SELECT *
22                FROM sys.certificates c
23                JOIN sys.crypt_properties cp
24                ON c.thumbprint = cp.thumbprint
25                CROSS JOIN sys.database_role_members r
26                WHERE r.role_principal_id = DATABASE_PRINCIPAL_ID ( '$(SPOKE_ADMIN_ROLE)' ) 
27                AND r.member_principal_id = DATABASE_PRINCIPAL_ID ( ORIGINAL_LOGIN() )  
28                AND c.name = '$(OBJECT_CERTIFICATE)'
29              AND c.pvt_key_encryption_type = 'PW'
30                AND cp.major_id = @@PROCID 
31                AND @@NESTLEVEL > 1 -- no direct exec of function 
32                AND IS_OBJECTSIGNED('OBJECT', @@PROCID, 'CERTIFICATE', c.thumbprint) = 1
33                AND EXISTS ( SELECT * FROM sys.database_role_members 
34                              WHERE [role_principal_id] = USER_ID('$(SPOKE_ADMIN_ROLE)')
35                              AND USER_NAME ([member_principal_id]) = SYSTEM_USER 
36                              AND SYSTEM_USER = ORIGINAL_LOGIN() ) )        
37      BEGIN
38        SET @Status = 'decode';
39        SET @Name = ( SELECT DECRYPTBYKEY( Name 
40                                         , 1
41                                         , CAST( KEY_GUID('$(SESSION_SYMMETRIC_KEY)') AS NVARCHAR (36) ) ) 
42        FROM @tvp );
43        SET @Value = ( SELECT DECRYPTBYKEY( Value, 1, @Name ) FROM @tvp );                    
44        IF PATINDEX('%.CONFIG', UPPER(@Name) )  -- no strength test, will fall through 
45         + PATINDEX('%.IDENTITY', UPPER(@Name) )             
46         + PATINDEX('%.PRIVATE', UPPER(@Name) ) 
47         + PATINDEX('%.SALT', UPPER(@Name) )           
48         + PATINDEX('%.SOURCE', UPPER(@Name) ) > 0       
49          SET @Status = 'OK';
50        ELSE
51          BEGIN
52            SET @UpValue = UPPER(@Value);
53            SET @Status = 'strength';
54            IF ( (    ( LEN(@Value) >= $(MIN_PHRASE_LENGTH) )   -- more is better
55                  AND ( PATINDEX('%[#,.;:]%'
56                      , @Value ) = 0 )   -- none of these symbols as recommended in BOL 
57                  AND ( SELECT CASE WHEN PATINDEX('%[A-Z]%'
58                                                  , @Value) > 0 
59                                    THEN 1 ELSE 0 END    -- has uppercase
60                              + CASE WHEN PATINDEX('%[a-z]%'
61                                                  , @Value) > 0 
62                                    THEN 1 ELSE 0 END    -- has lowercase  
63                              + CASE WHEN PATINDEX('%[0-9]%'
64                                                  , @Value) > 0 
65                                  THEN 1 ELSE 0 END    -- has number
66                              + CASE WHEN PATINDEX('%[^A-Za-z0-9]%'   -- has special
67                                                  , REPLACE( @Value, SPACE(1), '' ) ) > 0 
68                                    THEN 1 ELSE 0 END 
69                        ) > 2 ) )   -- at least 3 of 4
70              BEGIN 
71                -- black list is not so strong but can look for the obvious 
72                SET @Status = 'injection';                       
73                IF ( PATINDEX('%[__"'']%', @UpValue)   -- underscore (so no sp_ or xp_) or quotes
74                   + PATINDEX('%DROP%'   , @UpValue)   -- multi-character commands... 
75                   + PATINDEX('%ADD%'    , @UpValue)
76                   + PATINDEX('%CREATE%' , @UpValue)
77                   + PATINDEX('%SELECT%' , @UpValue)
78                   + PATINDEX('%INSERT%' , @UpValue)
79                   + PATINDEX('%UPDATE%' , @UpValue)
80                   + PATINDEX('%DELETE%' , @UpValue)
81                   + PATINDEX('%GRANT%'  , @UpValue)
82                   + PATINDEX('%REVOKE%' , @UpValue)
83                   + PATINDEX('%RUNAS%'  , @UpValue)
84                   + PATINDEX('%ALTER%'  , @UpValue)
85                   + PATINDEX('%EXEC%'   , @UpValue)
86                   + PATINDEX('%--%'     , @Value)     -- comments...
87                   + PATINDEX('%*/%'     , @Value) 
88                   + PATINDEX('%/*%'     , @Value)  = 0 )
89                  BEGIN 
90                    SET @Status = 'duplicate';
91                    IF NOT EXISTS ( SELECT *                  -- not already used  
92                                    FROM $(EHA_SCHEMA).$(NAMEVALUES_TABLE) n
93                                    WHERE ValueBucket = $(EHA_SCHEMA).AddSalt( '$(SPOKE_DATABASE)'
94                                                                              , '$(EHA_SCHEMA)'
95                                                                              , '$(NAMEVALUES_TABLE)'
96                                                                              , 'ValueBucket' 
97                                                                              , @Value)
98                                    AND CAST(DecryptByKey( n.Value -- should be rare
99                                                          , 1
100                                                         , @Name ) AS NVARCHAR (128) )  =  @Value )  
101                     SET @Status = 'OK';
102                 END
103             END
104          END
105     END
106   INSERT @metatvp
107     ( Status
108     , Signature ) 
109   VALUES 
110     ( @Status
111    , SignByCert( CERT_ID('$(AUTHENTICITY_CERTIFICATE)'), @Status ) );
112   RETURN;
113 END
114 GO
115 ADD SIGNATURE TO $(EHA_SCHEMA).CheckPhrase 
116 BY CERTIFICATE $(OBJECT_CERTIFICATE)
117 WITH PASSWORD = '$(OBJECT_CERTIFICATE_ENCRYPTION_PHRASE)';
118 GO

SQL injection input validation is only part of what goes on here. The function accepts an already encrypted name-value pair TVP as a parameter and returns a signed business rule validation result in a table. To do so, the schema and user authenticity are first verified before the phrase is decoded and the SQL injection detection rules are applied. Only if all rules are met is any IO required to verify that the phrase has not already been used.

The bi-directional encoding of parameters with a private, session-scoped symmetric key helps to narrow the SQL injection threat vector even before the filter(s) can be applied. It means that the passed values have already been used successfully in a T-SQL ENCRYPTBYKEY command in the current database session. Not that encryption does anything to prevent or detect SQL injection. It is more that the first touch of any user input value carries the higher risk. Likewise, the first use of an input in any dynamic SQL statement carries a higher risk. It is always better to do something benign with user input before you risk rubbing it against your data.
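
A hypothetical caller-side sketch of that encoding, reusing the SQLCMD tokens from the function above, would look something like the following. The NAMEVALUETYPE column names, the $(SESSION_KEY_ENCRYPTION_PHRASE) token and the @RawName/@RawValue variables standing in for user input are assumptions for illustration only.

-- sketch only: cipher the name/value pair with the session key, then run the gauntlet
OPEN SYMMETRIC KEY $(SESSION_SYMMETRIC_KEY)
  DECRYPTION BY PASSWORD = '$(SESSION_KEY_ENCRYPTION_PHRASE)';   -- assumed key protection
DECLARE @pair AS NAMEVALUETYPE;
INSERT @pair ( Name, Value )                                     -- assumed column names
SELECT ENCRYPTBYKEY( KEY_GUID('$(SESSION_SYMMETRIC_KEY)')
                   , @RawName
                   , 1
                   , CAST( KEY_GUID('$(SESSION_SYMMETRIC_KEY)') AS NVARCHAR (36) ) )
     , ENCRYPTBYKEY( KEY_GUID('$(SESSION_SYMMETRIC_KEY)')
                   , @RawValue
                   , 1
                   , @RawName );                                 -- the name authenticates the value
SELECT Status FROM $(EHA_SCHEMA).CheckPhrase ( @pair );
CLOSE SYMMETRIC KEY $(SESSION_SYMMETRIC_KEY);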

In the process of validation, two black lists are used to filter punctuation (line 55) and specific character sequences (lines 73-88) frequently identified as injection markers.

Another function from the schema validates names for T-SQL encryption hierarchy key export files. In this function the black list filter also includes the file-system-specific markers identified in the same SQL Server 2008 R2 Books Online article. The same somewhat cumbersome PATINDEX()-driven exclusion filter pattern is used in the file name function as in the hardness test.

1   -------------------------------------------------------------------------------
2   --    bwunder at yahoo dot com
3   --    Desc: apply file naming rules and conventions
4   --    name not already in use and no identified sql injection
5   -------------------------------------------------------------------------------
6   CREATE FUNCTION $(EHA_SCHEMA).CheckFile 
7     ( @Name VARBINARY (8000) )
8   RETURNS BIT
9   $(WITH_OPTIONS)
10  AS
11  BEGIN
12    RETURN (SELECT CASE WHEN  PATINDEX('%[#,.;:"'']%', Name) 
13                            + PATINDEX('%--%', Name)
14                            + PATINDEX('%*/%', Name)
15                            + PATINDEX('%/*%', Name)
16                            + PATINDEX('%DROP%', Name)
17                            + PATINDEX('%CREATE%', Name)
18                            + PATINDEX('%SELECT%', Name)
19                            + PATINDEX('%INSERT%', Name)
20                            + PATINDEX('%UPDATE%', Name)
21                            + PATINDEX('%DELETE%', Name)
22                            + PATINDEX('%GRANT%', Name)
23                            + PATINDEX('%ALTER%', Name) 
24                            + PATINDEX('%AUX%', Name) 
25                            + PATINDEX('%CLOCK$%', Name) 
26                            + PATINDEX('%COM[1-8]%', Name)
27                            + PATINDEX('%CON%', Name) 
28                            + PATINDEX('%LPT[1-8]%', Name) 
29                            + PATINDEX('%NUL%', Name) 
30                            + PATINDEX('%PRN%', Name) = 0
31                        AND NOT EXISTS 
32                ( SELECT *     -- any existing row means the name is already in use
33                  FROM $(EHA_SCHEMA).$(BACKUP_ACTIVITY_TABLE)
34                  WHERE BackupNameBucket = $(EHA_SCHEMA).AddSalt( '$(SPOKE_DATABASE)'
35                                                                , '$(EHA_SCHEMA)'
36                                                                , '$(BACKUP_ACTIVITY_TABLE)'
37                                                                , 'BackupNameBucket' 
38                                                                , Name ) )    
39                        THEN 1 ELSE 0 END
40            FROM (SELECT CAST( DECRYPTBYKEY ( @Name ) AS NVARCHAR(448) ) AS Name  
41                  FROM sys.certificates c
42                  JOIN sys.crypt_properties cp
43                  ON c.thumbprint = cp.thumbprint
44                  CROSS JOIN sys.database_role_members r
45                  WHERE r.role_principal_id = DATABASE_PRINCIPAL_ID ( '$(SPOKE_ADMIN_ROLE)' ) 
46                  AND r.member_principal_id = DATABASE_PRINCIPAL_ID ( ORIGINAL_LOGIN() )  
47                  AND c.name = '$(OBJECT_CERTIFICATE)'
48                  AND c.pvt_key_encryption_type = 'PW'
49                  AND cp.major_id = @@PROCID 
50                  AND @@NESTLEVEL > 1 
51                  AND IS_OBJECTSIGNED('OBJECT', @@PROCID, 'CERTIFICATE', c.thumbprint) = 1
52                  AND EXISTS (SELECT * FROM sys.database_role_members 
53                              WHERE [role_principal_id] = USER_ID('$(SPOKE_ADMIN_ROLE)')
54                              AND USER_NAME ([member_principal_id]) = SYSTEM_USER 
55                              AND SYSTEM_USER = ORIGINAL_LOGIN() ) ) AS derived );
56  END
57  GO
58  ADD SIGNATURE TO $(EHA_SCHEMA).CheckFile 
59  BY CERTIFICATE $(OBJECT_CERTIFICATE)
60  WITH PASSWORD = '$(OBJECT_CERTIFICATE_ENCRYPTION_PHRASE)';
61  GO

When processing happens one string at a time this filter is of small performance concern. However, processing a large set against such a filter could be slow and disruptive. The objects are obfuscated into the database WITH ENCRYPTION so only those with elevated access, development environment access and source repository access are likely to be aware of the filter details. Most of the logic in the functions is to verify the authority and authenticity of the caller and the calling object.

These functions demonstrate that fundamental SQL injection protection is easily achievable even for an application saddled with the crippled regular expression support of T-SQL. If performance or load is a service level issue, the CLR might be a better host for the functions. However, as with obfuscation, the most important place to validate against SQL injection is the point where data enters the system. In some cases SQL injection protection applied after the unvalidated text has moved inside SQL Server will be too late. Only when the user interface is a SQL command line could validating against SQL injection inside SQL Server be the best security choice. In scope, SQL injection prevention is an application layer exercise. That being said, skipping the SQL injection validation inside SQL Server is reckless. Robust security will employ layers of defense.
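
The application-layer half of that equation is parameterization. A minimal sketch with illustrative object names: the user-supplied value travels to the engine strictly as data and is never concatenated into the statement text.

DECLARE @stmt NVARCHAR(MAX) =
  N'SELECT c.CustomerId, c.Name
    FROM App.Customer AS c
    WHERE c.Name = @Name;';
EXEC sys.sp_executesql @stmt
                     , N'@Name NVARCHAR(128)'
                     , @Name = @UserInput;   -- @UserInput: the value already validated at the point of entry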

In my mind, the command line too must always be considered an attack vector, even if it is not a favorite SQL injection attack vector at this time and even if the application makes no use of the command line. As the metamorphosis of a now fluid security landscape unfolds, the hackers will be shining their cyber-flashlights everywhere and chasing anything shiny. To discount that the command line will continue to get a good share of malicious attention as long as there are command lines and hackers is negligence and/or denial.

For the Encryption Hierarchy Administration schema that uses the functions above, a Powershell deployment and administration interface is helpful to improve security. With a pure T-SQL approach there is always a risk that the user input text of secrets is exposed in memory buffers before the input value can be ciphered by the database engine. Granted, it is a brief period of time, and I am not even sure how one would go about sniffing SQLCMD command line input without running in a debugger or waiting for the input to move into the database engine workspace. It surely must be available somewhere in memory. The scenario is a target. I know I have never checked whether the operating system is running a debugger in any T-SQL script I have ever written. This helps to illustrate why the best time and place to encrypt data is at the time and place it is generated or enters the system. Even then, 100% certainty will remain elusive if the system cannot be verified to be keylogger free.

The utility models a somewhat unusual scenario where encryption at the database is the right choice. Nonetheless, getting the many secrets required for the administration of encryption keys and key backups entered into the system presents a potential for exposure to memory mining hackers. Using SecureString input and SecureString based SMO methods to instantiate the database objects that need the secrets can eliminate much of that vulnerability. As you may know, a SecureString is a .NET object that is kept encrypted while in memory, protected by a key tied to the current user’s session, with deterministic cleanup from memory that can be more secure than garbage collection. It is relatively easy for the user to decrypt the SecureString data on demand, but doing so results in the sensitive information becoming available as clear text in the memory where the un-encrypted copy is written. No other users have access to the encryption key.

  
function Decode-SecureString 
{   
    [CmdletBinding( PositionalBinding=$true )]
    [OutputType( [String] )]
    param ( [Parameter( Mandatory=$true, ValueFromPipeline=$true )]
            [System.Security.SecureString] $secureString )  
    begin 
    { $marshal = [System.Runtime.InteropServices.Marshal] }
    process 
    { $BSTR = $marshal::SecureStringToBSTR($secureString )
     $marshal::PtrToStringAuto($BSTR) } 
    end
    { $marshal::ZeroFreeBSTR($BSTR) }
}

Decode-SecureString $( ConvertTo-SecureString '1Qa@wSdE3$rFgT' -AsPlainText -Force )

Powershell obfuscates Read-Host user input of type SecureString with asterisks (*) on the input screen. With the ISE you get a WPF input dialog that more clearly shows the prompt but could also become annoying for command-line purists.

To evaluate the use of a Powershell installer, I coded a Powershell installation wrapper for the hub database installation scripts of the utility. The hub database needs four secrets: the passwords for four contained database users. With this change it makes no sense to add an extra trip to the database to evaluate the hardness and SQL injection vulnerability of each secret. Instead, the SQL injection input validation logic from the T-SQL functions above is migrated to a Powershell Advanced Function – Advanced meaning that the function acts like a CmdLet – that accepts a SecureString.

  
1  function Test-EHSecureString 
2  {   
3   [CmdletBinding( PositionalBinding=$true )]
4      [OutputType( [Boolean] )]
5      param ( [Parameter( Mandatory=$true, ValueFromPipeline=$true )] 
6              [System.Security.SecureString] $secureString
7            , [Int32] $minLength = 14 
8            , [Int32] $minScore = 3 )  
9      begin 
10     { 
11         $marshal = [System.Runtime.InteropServices.Marshal] 
12     }
13     process 
14     {   # need the var to zero & free unencrypted copy of secret
15         [Int16] $score = 0
16         $BSTR = $marshal::SecureStringToBSTR($secureString)
17         if ( $marshal::PtrToStringAuto($BSTR).length -ge $minLength )
18         { 
19             switch -Regex ( $( $marshal::PtrToStringAuto($BSTR) ) )
20            {
21             '[#,.;:\\]+?' { Write-Warning ( 'character: {0}' -f $Matches[0] ); Break }
22             '(DROP|ADD|CREATE|SELECT|INSERT|UPDATE|DELETE|GRANT|REVOKE|RUNAS|ALTER)+?' 
23                           { Write-Warning ( 'SQL command: {0}' -f $Matches[0] ); Break }
24             '(AUX|CLOCK|COM[1-8]|CON|LPT[1-8]|NUL|PRN)+?' 
25                           { Write-Warning ( 'dos command: {0}' -f $Matches[0] ); Break } 
26             '(--|\*\/|\/\*)+?'{ Write-Warning ( 'comment: {0}' -f $Matches[0] ); Break }
27             '(?-i)[a-z]'  { $score+=1 }
28             '(?-i)[A-Z]'  { $score+=1 }
29             '\d+?'        { $score+=1 }
30             '\S\W+?'      { $score+=1 }
31             Default { Write-Warning $switch.current; Break }        
32            } 
33         }
34         else
35         { write-warning 
36                      ( 'length: {0}' -f $( $marshal::PtrToStringAuto($BSTR).length ) ) } 
37         write-warning ( 'score: {0}' -f $score )  
38         $( $score -ge $minScore )
39     }        
40     end { $marshal::ZeroFreeBSTR($BSTR) }
41 }

One thing is for sure: much less code is required in Powershell than in T-SQL to create a user. To securely invoke the function, a Read-Host -AsSecureString prompts for user input that will go into the memory location allocated to the SecureString only as encrypted data. Here the Powershell script will prompt for input until it gets an acceptable value. Remember, no one else will be able to decode this value in memory; only the user that created the SecureString. Defense-in-depth demands that care be taken that the SecureString memory location is not captured for an off-line brute force interrogation.

 

do { $HUB_ADMIN_PASSWORD = $( Read-Host 'HUB_ADMIN_PASSWORD?' -AsSecureString ) } 
until ( $( Test-EHSecureString $HUB_ADMIN_PASSWORD ) )

Then the entered secret is applied to the database using SMO. In this case a user with password will be created in a contained database.

  
if ( $( Get-ChildItem -Name) -notcontains 'HubAdmin' )
{
    $HubAdmin = New-Object Microsoft.SqlServer.Management.Smo.User
    $HubAdmin.Parent = $smoHubDB
    $HubAdmin.Name = 'HubAdmin'     
    $HubAdmin.Create( $HUB_ADMIN_PASSWORD ) 
}

The Test-EHSecureString function does expose the clear text of the secret in the PSDebug trace stream during the switch operation as $switch. To my knowledge there is no way to obfuscate that value. On top of that, the disposal of the $switch automatic variable is under garbage collection, so there is no reliable way to know when you can stop wondering whether anyone found it. That uncertainty may be more of a risk than the SQLCMD exposure the SecureString is supposed to solve? On the other hand, the risk of exposure of the SQLCMD setvar values is undetermined, so it would be silly to pretend to quantify an unknown risk. What I know for sure is that those values have to touch primary storage buffers in order to load the variables and then populate the SQLCMD – $(VARIABLE_NAME) – tokens in the script. At least with the Powershell script I can quantify the risk and take all necessary precautions to mitigate it. With SQLCMD setvar variables, about all I can do to be certain my secrets are not fished out of the free pool or wherever else they might be buffered as clear text is remove the power. Even that is no guarantee the secrets are not leaked to a paging file, spooler or log while exposed internally or externally as clear text as the system shuts down.

At this point I’m convinced that it is best to validate Powershell SecureString input against SQL injection threats. The risk that someone will find the secrets in local memory and use them with malicious intent is, in my estimation, far less than the risk from SQL injection. I will continue to integrate Powershell into the install script with the goal of using a Powershell installer. This is a much better input scenario than the T-SQL event obfuscation technique I had been using.

I will leave the SQL injection filters in the T-SQL functions to complement the Powershell filter for two reasons. Defense-in-depth and [sic]Defense-in-depth. 

Posted in Code Review, Data Loading, Secure Data, Testing

TSQL Cryptographic Patterns – part 9: we’d better take this OFFLINE

There is a compelling defense-in-depth rationale for enabling AUTO_CLOSE on a database where sensitive data is stored.

ALTER DATABASE $(ANY_USER_DATABASE) SET AUTO_CLOSE ON;

When AUTO_CLOSE is ON the database will cleanly shut down when the last active user session disconnects or moves to another database. To be cleanly shut down means that the database can be opened again later without the need for SQL Server to run recovery on that database: everything in the log has been processed into the data set. FWIW: we don’t get to decide when recovery runs; the database engine makes that determination. We do get to mess around with the CHECKPOINT a little more in SQL Server 2012 with the TARGET_RECOVERY_TIME database option that overrides the server recovery interval. That actually does appear to be a step in the direction of exposing control of AUTO_CLOSE, though probably not intentional.
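
For example, the per-database indirect checkpoint setting in SQL Server 2012 looks like this (the 60 second target is illustrative only):

ALTER DATABASE $(ANY_USER_DATABASE) SET TARGET_RECOVERY_TIME = 60 SECONDS;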

Using AUTO_CLOSE is easy:

  • Once enabled there is nothing more to do.
  • The easiest way to tell if AUTO_CLOSE is ON is to query the is_auto_close_on column in sys.databases.
  • The easiest way to tell if a database with AUTO_CLOSE ON is cleanly shutdown is to query the is_cleanly_shutdown column in sys.databases.
  • The most eye-opening way to tell if the database is closed at the present time is to try to copy the .mdf or .ldf. If you can copy the files the database is cleanly shut down; if you cannot, the database is open and accumulating resources, i.e., data pages, locks, latches, versions, query plans, connections, etc.

(Note that there are a few respectable bloggers claiming that AUTO_CLOSE is marked for deprecation since SQL 2008. I believe there is some confusion. The blogs I have seen with this claim reference the SQL Server 2000 DMO AutoClose Property page as evidence. If you look, you will notice that all the DMO documentation pages for SQL Server 2000 carry the same deprecation warning. Pretty sure DMO is the deprecated technology not AUTO_CLOSE. I could be wrong.)

When a database cleanly shuts down all resources held for that database are freed from memory. Log records are processed such that no recovery is required when the database “opens”. Encryption keys and certificates are closed preventing any free rides on encryption hierarchies opened during legitimate use. DMV data collected from that database disappears. The file system locks on all log and data files are released. Data is flushed from cache. If TDE or file system encryption is in use, this moves all data behind that layer of obfuscation. The unloading is asynchronous, happening within 300ms.

The main difference between a database with AUTO_CLOSE ON when cleanly shutdown and an OFFLINE database is the AUTO part. That is, an administrator must manually transition the database between ONLINE and OFFLINE and back while AUTO_CLOSE automagically transitions the database between the unmodifiable state and the usable state for any valid request.

I notice that databases do not get the is_cleanly_shutdown bit set when the database is taken OFFLINE. While I cannot repro it on demand, I also noticed that taking the test database OFFLINE will, every now and again, force a recovery when that database goes back ONLINE. The documentation is clear that an OFFLINE database is cleanly shut down. Wonder what’s up with that?


SELECT name, is_auto_close_on, state_desc, is_cleanly_shutdown
FROM sys.databases WHERE PATINDEX('Test%', name ) = 1;


name              is_auto_close_on state_desc  is_cleanly_shutdown
----------------- ---------------- ----------  ------------------- 
Test_OFFLINE                     0    OFFLINE                    0 
Test_AUTO_CLOSE                  1    ONLINE                     1 

The pooled connection overhead and bottlenecking that come with AUTO_CLOSE are fairly well known. Most of the time that is about all one needs to know to avoid AUTO_CLOSE. The experts simply tell us to turn AUTO_CLOSE off and leave it at that. In fact, the best practice policies included in the SQL Server installation will disable AUTO_CLOSE on all databases.

Enabling the best practice policies is far better than not using policies or following the painful trajectory of trial and error to “find” the correct best practices. In all cases beware the dogma. A well-considered policy built upon best practices, patience and persistence is preferred.

Applications that create and store sensitive data are at risk of compromise if adequate consideration is not given to vulnerabilities that exploit the available SQL Server resource metadata and/or SQL Server primary storage buffers. The query cache, for example, can be helpful in understanding the data store and the data flow. That is useful information for man-in-the-middle attackers, SQL injection attackers or insider hi-jinks. Likewise, the sys.dm_exec_requests DMV or the sys.sysprocesses compatibility view will point the uninitiated and uninvited to every client serviced by a database host. From there a SQL injection attacker can map the application, identify weak hosts inside the DMZ and perhaps establish SQL injection based command-line access targeting the weak internal node. The ways to be hacked are many.
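
Two quick queries illustrate how much reconnaissance sits in that metadata for any principal that holds (or has hijacked) VIEW SERVER STATE permission:

-- cached statement text maps the data store and the data flow
SELECT st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text( cp.plan_handle ) AS st;
-- the session list names every client the host services
SELECT host_name, program_name, login_name
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;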

The security implications of database resources are not normally considered in application design. If anything, database architectures err on the side of keeping resources loaded and exposed by making more memory available to the SQL Server. This increases the risk that cached data, data storage patterns, data flow patterns and cached query text can be mined for malicious purposes. To be sure, database resource exploits do not represent the low hanging fruit, but just as certainly most of the low hanging fruit has by now been plucked. Nonetheless, within the context of a well-considered defense-in-depth data security architecture, securing database resource access is essential. Presuming adequate system obfuscation of buffers in the free pool, releasing resources held in memory will provide a layer of protection against exploits of the SQL Server memory space.

From another perspective: only if the storage location is secured and encrypted would it be wise to leverage AUTO_CLOSE as a security layer. Anyone with read access to the storage location can copy the database files when cleanly shutdown. An un-encrypted database file can also be opened in EMERGENCY mode (READONLY) on another SQL Server – illustrating the value of encrypted storage.

Applications with a relatively low rate of change and highly sensitive data, such as the Encryption Hierarchy Administration T-SQL utility that provided the example T-SQL for this series or some witness protection relocation database, are candidates for the anti-sniffing and anti-hijacking protections afforded by resource unloading. Furthermore, when TDE is also configured and database resources are unloaded, the most complete benefit of TDE can be achieved. Under such conditions there are no back doors or alternative access paths that can circumvent the encryption.
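
For completeness, a minimal TDE setup sketch along the lines of the steps visible in the test output below; the certificate name, database name and the $(MASTER_KEY_ENCRYPTION_PHRASE) token are placeholders, not part of the utility:

USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '$(MASTER_KEY_ENCRYPTION_PHRASE)';  -- skip if one already exists
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';                   -- back this certificate up
USE TestDb;
CREATE DATABASE ENCRYPTION KEY
  WITH ALGORITHM = AES_256
  ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE TestDb SET ENCRYPTION ON;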

I decided to put it to a quick test. The test output below shows the resource caching behavior around AUTO_CLOSE’s clean shutdown and OFFLINE under 3 configuration scenarios:

  1. AUTO_CLOSE ON
  2. AUTO_CLOSE OFF
  3. AUTO_CLOSE ON with Active Service Broker

Connecting to BWUNDER-PC\ELEVEN...
Set database context to 'master'.
Microsoft SQL Server 2012 - 11.0.2100.60 (X64) 
    Feb 10 2012 19:39:15 
    Copyright (c) Microsoft Corporation
    Enterprise Evaluation Edition (64-bit) on Windows NT 6.1 <X64> 
(Build 7601: Service Pack 1)

CREATE DATABASE TestDb
OPEN MASTER KEY
CREATE DATABASE ENCRYPTION KEY
Warning: The certificate ~snip~ has not been backed up ~snip~
Set database context to 'tempdb'.
CREATE FUNCTION tempdb.dbo.fnServerResources
CREATE PROCEDURE tempdb.dbo.CheckResourcesFromTempDB
Set database context to 'TestDb'.
OPEN MASTER KEY
CREATE PROCEDURE TestDb.dbo.CheckDbResources
 
#1  SET AUTO_CLOSE ON                 -- database resources ----------------
                                      sessions   objects   q-stats     locks
try invoke AUTO_CLOSE                        3         2         0         2
Changed database context to 'master'.
wait a second...                             0         2         1         0
cleanly shutdown                             0         0         0         0
Changed database context to 'TestDb'.
initiate OFFLINE                             1         2         0         2
Changed database context to 'master'.
SET OFFLINE                                  0         0         0         0
SET ONLINE                                   1         1         0         2
 
#2  SET AUTO_CLOSE OFF                -- database resources ----------------
                                      sessions   objects   q-stats     locks
try invoke AUTO_CLOSE                        1         2         0         2
Changed database context to 'master'.
wait a second...                             0         2         1         0
not shutdown                                 0         2         1         0
Changed database context to 'TestDb'.
initiate OFFLINE                             1         2         1         2
Changed database context to 'master'.
SET OFFLINE                                  0         0         0         0
SET ONLINE                                   1         1         0         2
 
 Configure Service Broker
CREATE PROCEDURE TestDb.dbo.TestQActivationProcedure
CREATE QUEUE WITH ACTIVATION ON
CREATE SERVICE for QUEUE
CREATE EVENT NOTIFICATION to SERVICE
1 events enqueued
 
#3  SET AUTO_CLOSE ON                 --- database resources ---------------
                                      sessions   objects   q-stats     locks
try invoke AUTO_CLOSE                        2         3         1         3
Changed database context to 'master'.
wait a second...                             1         3         2         1
not shutdown                                 1         3         2         1
Changed database context to 'TestDb'.
initiate OFFLINE                             2         3         2         3
Changed database context to 'master'.
SET OFFLINE                                  0         0         0         0
SET ONLINE                                   1         1         0         2
 
Disconnecting connection from BWUNDER-PC\ELEVEN...

Unfortunately AUTO_CLOSE does not pass this sniff test. It simply is not reliable under many common SQL Server configurations. It is not consistently good at returning resources to a busy desktop or at closing keys. A persistent connection, a daemon process, a scheduled job, replication, mirroring or Service Broker activation – among other things – can interfere with a clean shutdown, leaving the database memory work space and cache always available for malicious perusal. AUTO_CLOSE too easily becomes a phantom security layer. You might find some comfort that it is enabled, but you can never be certain that the protection is working.

The best way to be sure a database is shut down when idle is to take the database OFFLINE. That would also require a step to bring the database online before each use. Given that necessity, detaching the database would also work with the added advantage that the database reference is removed from any server scoped metadata in the catalog.
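
The manual equivalents are simple enough; the WITH ROLLBACK IMMEDIATE and @skipchecks choices here are illustrative rather than prescriptive:

ALTER DATABASE $(ANY_USER_DATABASE) SET OFFLINE WITH ROLLBACK IMMEDIATE;  -- take the idle database out of service
ALTER DATABASE $(ANY_USER_DATABASE) SET ONLINE;                           -- bring it back before the next use
-- or remove the catalog reference entirely until the database is needed again
EXEC master.dbo.sp_detach_db @dbname = '$(ANY_USER_DATABASE)', @skipchecks = 'true';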

Posted in Encryption Hierarchies, Secure Data, Testing