Human Generated Data

Title

Untitled (two firemen and small dog next to firetruck)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14516

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 97.1
Person 97.1
Car 96.9
Automobile 96.9
Vehicle 96.9
Transportation 96.9
Spoke 94.5
Machine 94.5
Wheel 93.7
Person 87.9
Tire 87.1
Wheel 86.5
Alloy Wheel 84.1
Person 77.5
People 72
Car Wheel 70.6
Sports Car 66.7
Shorts 63.9
Clothing 63.9
Apparel 63.9
Face 61.4
Coat 58
Suit 58
Overcoat 58
Race Car 56.6
Sedan 56.5
Hot Rod 55.2
Person 45
Person 43.3
Person 41.9
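
The Amazon labels above, with 0-100 confidence scores, are the kind of output the AWS Rekognition DetectLabels API returns. A minimal sketch of reproducing such a list, assuming configured boto3 credentials and a hypothetical local copy of the image (photo.jpg):

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=40,
    )
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated names with different scores (the several "Person" rows
    # above) come from per-instance bounding-box confidences.
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}")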

Imagga
created on 2022-01-29

car 65.7
vehicle 59.4
motor vehicle 43.2
transportation 37.6
transport 35.6
auto 34.4
automobile 32.5
road 28.9
drive 27.4
speed 26.6
wheel 25
truck 24.4
amphibian 22.7
engine 22.1
motor 21.3
wheeled vehicle 21.2
power 18.5
aircraft 18.4
luxury 17.1
classic 16.7
driving 16.4
metal 16.1
race 15.3
machine 14.2
chrome 14.1
sky 14
fast 14
racing 13.7
aviation 13.7
plane 13.5
airplane 13.5
military 13.5
war 13.5
old 13.2
vintage 13.2
street 12.9
style 12.6
retro 12.3
tire 12.2
sport 12.2
light 12
sports 12
design 11.8
cars 11.7
flight 11.5
urban 11.4
travel 11.3
technology 11.1
military vehicle 10.8
city 10.8
expensive 10.5
elegant 10.3
device 10.2
black 10.2
jet 10
fire engine 9.9
propeller 9.8
automotive 9.8
wheels 9.8
emergency 9.6
racer 9.5
traffic 9.5
show 9.5
aircraft carrier 9.3
antique 9.2
air 9.2
landing 8.8
pilot 8.8
force 8.8
army 8.8
driver 8.7
motion 8.6
warship 8.4
modern 8.4
land 8.3
airport 8.2
ship 8.2
reflection 8.1
detail 8
building 7.9
helicopter 7.9
day 7.8
color 7.8
model 7.8
highway 7.7
moving 7.6
fly 7.5
rotor 7.3
people 7.2
headlight 7.1
steel 7.1
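
Imagga exposes a comparable tagging endpoint (v2 /tags). A sketch calling the REST API directly, with a hypothetical image URL and credentials:

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # hypothetical URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # hypothetical credentials
)
# Each result carries an English tag name and a 0-100 confidence score,
# matching the list above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")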

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

road 97.9
text 97.7
land vehicle 94.5
vehicle 93.1
outdoor 91.3
wheel 87.6
car 86.2
transport 76.8
military vehicle 54.1
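
The Microsoft tags correspond to the Azure Computer Vision Tag operation, which reports confidence on a 0-1 scale (shown above as percentages). A sketch against the v3.2 REST endpoint, with a hypothetical resource endpoint, key, and image URL:

import requests

ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # hypothetical
response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "<key>"},  # hypothetical key
    json={"url": "https://example.com/photo.jpg"},  # hypothetical URL
)
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")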

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 58.8%
Calm 27%
Confused 18.9%
Disgusted 17.3%
Sad 13.5%
Angry 12.3%
Fear 7.4%
Happy 2%
Surprised 1.7%
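
Age range, gender, and the per-emotion confidence breakdown above are what the Rekognition DetectFaces API returns when all attributes are requested. A sketch under the same assumptions as before:

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored independently; list from most to least confident.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")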

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
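
The four blocks above are one annotation per face detected by Google Vision, which reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client, assuming a hypothetical local image:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:  # one annotation per detected face
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)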

Feature analysis

Amazon

Person 97.1%
Car 96.9%
Wheel 93.7%

Captions

Microsoft

a group of people riding on the back of a truck 73.2%
a group of men riding on the back of a truck 65%
a person riding on the back of a truck 64.9%
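
Ranked captions like these come from the Azure Computer Vision Describe operation, which returns up to maxCandidates caption candidates with 0-1 confidences. A sketch under the same hypothetical endpoint and key as above:

import requests

ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # hypothetical
response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": "<key>"},  # hypothetical key
    json={"url": "https://example.com/photo.jpg"},  # hypothetical URL
)
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")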

Text analysis

Amazon

a
MJI7
MJI7 YT37A2 173/
173/
YT37A2
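
Fragments like these are typical of the Rekognition DetectText API, which reports each detected LINE and also its component WORDs, so substrings repeat. A sketch:

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_text(Image={"Bytes": f.read()})
for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; lines are also broken out into words,
    # which is why the list above repeats substrings.
    print(detection["Type"], detection["DetectedText"])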

Google

Y
MJI7
MJI7 Y
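
The Google results come from Vision text detection, where the first annotation is the full recovered text and the rest are its individual elements. A sketch with the same client as above:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())
response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)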