Human Generated Data

Title

Untitled (two men posed breaking ground on sidewalk next to street marker and jackhammer)

Date

1949

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6247

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Wheel 99.9
Machine 99.9
Person 99.7
Human 99.7
Person 98.4
Tire 90.3
Spoke 81.6
Transportation 79.6
Car 79.6
Automobile 79.6
Vehicle 79.6
Outdoors 79.3
Tarmac 70.4
Asphalt 70.4
Nature 69.7
Car Wheel 68.8
Field 59.7
Countryside 56.3

Clarifai
created on 2023-10-26

monochrome 99.7
people 99.5
vehicle 97.6
tractor 96.9
adult 92.9
street 92.7
winter 92.2
exert 92.1
man 90.8
black and white 90.7
machine 90.3
snow 89
two 88.5
group together 88.3
action 88.1
child 87.3
transportation system 85.3
one 85.2
group 85
recreation 81.2

Imagga
created on 2022-01-22

truck 41.5
vehicle 32.4
artillery 29.7
cannon 28.7
motor vehicle 27.8
car 24.3
plow 24.1
field artillery 21.4
armament 21.3
high-angle gun 21
tractor 18.5
tool 18
wheel 17.9
sky 17.9
landscape 17.8
field 17.6
work 17.3
machine 17.2
road 16.3
transportation 16.1
farm 16.1
outdoor 16.1
old 15.3
outdoors 14.9
man 14.8
industrial 14.5
industry 14.5
wheeled vehicle 14.4
grass 14.2
rural 14.1
machinery 13.6
gun 13.6
tow truck 13.3
equipment 13.3
environment 13.2
tire 13.1
weaponry 12.9
fire engine 12.8
working 12.4
agriculture 12.3
transport 11.9
trailer 11.7
male 10.6
trailer truck 10.5
heavy 10.5
summer 10.3
safety 10.1
protection 10
dirty 9.9
person 9.9
country 9.7
auto 9.6
farming 9.5
smoke 9.3
travel 9.2
countryside 9.1
destruction 8.8
weapon 8.8
driving 8.7
automobile 8.6
outside 8.6
construction 8.6
drive 8.5
people 8.4
action 8.3
hay 8.3
danger 8.2
horizon 8.1
farmer 8
scenic 7.9
disaster 7.8
antique 7.8
men 7.7
vintage 7.4
sport 7.4
land 7.4
cart 7.3
building 7.1
snow 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

outdoor 99.3
tree 98.4
text 96.8
black and white 84.5
wheel 83.1
vehicle 82.5
land vehicle 81.8
tire 76.1
white 67.1
monochrome 61.7
person 50.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 21-29
Gender Male, 92.2%
Calm 77.3%
Surprised 7%
Happy 6.3%
Confused 4%
Sad 2.8%
Disgusted 1.3%
Angry 0.8%
Fear 0.5%

AWS Rekognition

Age 18-26
Gender Male, 90.3%
Calm 59.4%
Disgusted 16.1%
Confused 9.1%
Surprised 6.6%
Sad 3.4%
Fear 3%
Happy 1.7%
Angry 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Wheel 99.9%
Person 99.7%
Car 79.6%

Categories

Text analysis

Amazon

SANBORN

Google

SANBORZ
SANBORZ