Human Generated Data

Title

Untitled (children watching man sharpening tools)

Date

1959

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15414

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Machine 99.9
Wheel 99.9
Person 99
Human 99
Person 98.4
Bike 98.2
Transportation 98.2
Bicycle 98.2
Vehicle 98.2
Person 98
Person 98
Person 97.4
Person 92.9
Spoke 90.8
Wheel 89.3
Person 81.2
Person 73
Tire 66.4
People 62.4
Clothing 57.4
Apparel 57.4

Imagga
created on 2022-03-05

carriage 34.3
wheeled vehicle 31.9
cart 29.1
bicycle 27.9
wheelchair 26.1
vehicle 25.1
jinrikisha 24.5
wheel 21.9
old 20.2
tricycle 18.6
bike 18.5
transportation 17.9
wagon 17.2
chair 16.7
street 16.6
seat 15.9
city 15.8
man 15.4
spinning wheel 15
travel 14.8
transport 13.7
outdoor 13
people 12.8
urban 12.2
spinning machine 12.1
wall 12
cycle 11.7
male 11.3
outdoors 11.3
vacation 10.6
ride 10.4
horse 10.4
outside 10.3
sunset 9.9
disabled 9.9
retro 9.8
conveyance 9.8
sport 9.6
color 9.4
park 9.4
support 9.2
vintage 9.1
textile machine 9.1
aged 9
summer 9
recreation 9
person 8.6
device 8.3
sky 8.3
tourism 8.2
tourist 8
bicycle-built-for-two 7.9
pedal 7.9
black 7.8
silhouette 7.4
landscape 7.4
lifestyle 7.2
colorful 7.2
holiday 7.2
machine 7.1
day 7.1
sea 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 99.7
ground 96.5
text 96.1
bicycle 92
wheel 90.1
land vehicle 81.2
vehicle 68.4
bicycle wheel 67.8
tire 50.6

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 74.3%
Confused 16.9%
Angry 4.5%
Disgusted 1.5%
Surprised 1.2%
Sad 0.6%
Happy 0.5%
Fear 0.3%

AWS Rekognition

Age 6-12
Gender Female, 99.4%
Calm 68.2%
Happy 19.6%
Sad 8.4%
Angry 1.4%
Fear 0.8%
Surprised 0.7%
Confused 0.5%
Disgusted 0.4%

AWS Rekognition

Age 24-34
Gender Male, 98.5%
Calm 59.3%
Happy 37.8%
Confused 0.7%
Surprised 0.6%
Sad 0.6%
Angry 0.5%
Disgusted 0.4%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Wheel 99.9%
Person 99%
Bicycle 98.2%

Captions

Microsoft

a man and a woman sitting on a bicycle 53.9%
a person sitting on a bicycle 53.8%
a close up of a person with a bicycle in front of a building 53.7%

Text analysis

Amazon

SHAR
NED
SCISSOR
LAWNMOWERS
SCHILLER
POWER
393
HAND
POWER c HAND
SAWS
c
100LS

Google

SAWS
SCISSOR
AND
KNIVE
TOOLS
SHAR
SCHILL
TEE POWER LAWNMOWERS SAWS KNIVE SCISSOR AND TOOLS SHAR ENED SCHILL
TEE
POWER
LAWNMOWERS
ENED