Human Generated Data

Title

Untitled (children on firetruck)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16653

Machine Generated Data

Tags

Amazon
created on 2022-02-18

Person 98.1
Human 98.1
Wheel 97.9
Machine 97.9
Person 96.5
Person 95.8
Person 91.7
Person 91
Vehicle 88.9
Transportation 88.9
Road 88.7
Person 88.4
Person 88.1
Car 87.9
Automobile 87.9
Person 87.5
Car 87.1
Person 87
Ship 86.4
Person 85.9
Person 85.8
Person 85
Car 84.9
Person 83.7
Car 83.2
Car 81.5
Person 81.5
Tarmac 76.7
Asphalt 76.7
Pedestrian 74.8
Person 69.3
Crowd 69.2
Nature 67.6
Person 67.5
Military 64.7
Outdoors 63.7
Navy 63.6
Cruiser 63.6
Photography 61.8
Face 61.8
Photo 61.8
Portrait 61.8
Path 60.1
Freeway 58.7
Urban 58.2
City 58.2
Town 58.2
Street 58.2
Building 58.2
Battleship 57.6
Shorts 57
Apparel 57
Clothing 57
Person 51.9
Person 50.9
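Labels such as "Person" and "Car" repeat in the list above because the detector emits one entry per detected instance, each with its own confidence. A minimal sketch of aggregating such a flattened "label score" list into per-label counts and best confidence (the sample data below is transcribed from a few lines of the Amazon tag list; the two-column format is an assumption about how the page flattens the output):

```python
from collections import defaultdict

# A few "label score" lines transcribed from the Amazon tag list above.
raw = """Person 98.1
Human 98.1
Wheel 97.9
Machine 97.9
Person 96.5
Person 95.8
Car 87.9
Car 87.1
Person 50.9"""

# Aggregate repeated labels: instance count and best confidence per label.
best = defaultdict(lambda: (0, 0.0))  # label -> (count, max score)
for line in raw.splitlines():
    label, score = line.rsplit(" ", 1)
    count, top = best[label]
    best[label] = (count + 1, max(top, float(score)))

# Report labels ordered by their strongest detection.
for label, (count, top) in sorted(best.items(), key=lambda kv: -kv[1][1]):
    print(f"{label}: {count} instance(s), top confidence {top}%")
```

This mirrors why the Feature analysis section below lists "Person 98.1%" once: it keeps only the highest-confidence instance per label.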

Imagga
created on 2022-02-18

passenger 22.2
sky 21
vehicle 19.2
city 16.6
architecture 16.4
destruction 15.6
black 15
sea 14.8
power 14.3
building 13.7
danger 13.6
ship 13.4
industrial 12.7
water 12.7
wheeled vehicle 12.6
machine 12.5
war 12.4
radio telescope 12.1
travel 12
seller 11.8
motor vehicle 11.3
landscape 11.2
street 11
park 10.9
tourism 10.7
environment 10.7
nuclear 10.7
military 10.6
old 10.4
car 10.4
weapon 10.4
construction 10.3
industry 10.2
road 9.9
history 9.8
astronomical telescope 9.7
gas 9.6
urban 9.6
cloud 9.5
beach 9.4
clouds 9.3
ocean 9.1
protection 9.1
dirty 9
pollution 8.6
day 8.6
gun 8.6
cannon 8.6
truck 8.2
artillery 8.2
coast 8.1
sand 7.9
summer 7.7
craft 7.7
outdoor 7.6
field 7.5
man 7.4
vacation 7.4
airship 7.3
telescope 7.2
landmark 7.2
to 7.1

Google
created on 2022-02-18

Microsoft
created on 2022-02-18

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 99.2%
Calm 46.6%
Happy 26.2%
Disgusted 14.6%
Sad 5.5%
Fear 2.5%
Angry 2.5%
Surprised 1.4%
Confused 0.8%

AWS Rekognition

Age 14-22
Gender Male, 67.7%
Calm 90.4%
Sad 3%
Confused 2%
Angry 1.5%
Happy 1.2%
Disgusted 1.1%
Fear 0.6%
Surprised 0.2%

AWS Rekognition

Age 23-33
Gender Male, 55.2%
Calm 80.8%
Sad 9.4%
Happy 5.7%
Confused 1.1%
Disgusted 1%
Surprised 0.7%
Fear 0.6%
Angry 0.5%

AWS Rekognition

Age 11-19
Gender Female, 76.4%
Happy 50.2%
Calm 46.3%
Fear 1%
Surprised 0.8%
Sad 0.7%
Disgusted 0.6%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 16-22
Gender Female, 78.2%
Calm 45.1%
Happy 40%
Sad 6.1%
Confused 4.6%
Angry 1.6%
Fear 1%
Surprised 0.9%
Disgusted 0.8%

AWS Rekognition

Age 18-26
Gender Male, 95.4%
Fear 39.1%
Calm 23.6%
Sad 21.9%
Angry 5.8%
Happy 5.6%
Disgusted 2.2%
Surprised 1.1%
Confused 0.7%

AWS Rekognition

Age 21-29
Gender Female, 84.8%
Calm 78%
Sad 13.5%
Confused 3.6%
Happy 1.6%
Disgusted 1%
Surprised 1%
Angry 0.9%
Fear 0.3%

AWS Rekognition

Age 19-27
Gender Male, 93.8%
Calm 75.6%
Angry 6.6%
Sad 5.9%
Happy 4%
Confused 3.7%
Fear 2.2%
Disgusted 1.1%
Surprised 0.9%

AWS Rekognition

Age 23-33
Gender Male, 86%
Calm 95.6%
Fear 1.4%
Angry 1%
Happy 0.8%
Sad 0.4%
Disgusted 0.3%
Surprised 0.3%
Confused 0.2%

AWS Rekognition

Age 18-26
Gender Female, 99.5%
Happy 39.5%
Sad 29.4%
Calm 14.4%
Fear 7.9%
Angry 3%
Surprised 2.9%
Disgusted 1.8%
Confused 1.1%

AWS Rekognition

Age 40-48
Gender Female, 51.1%
Happy 73.2%
Calm 9.2%
Fear 8%
Sad 5.5%
Confused 2%
Disgusted 1.1%
Angry 0.9%
Surprised 0.2%
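Each face record above carries a full emotion distribution rather than a single verdict; a headline emotion is just the argmax of that distribution, and the margin over the runner-up shows how decisive it is. A sketch using the first face's values transcribed from above:

```python
# Emotion distribution for the first face above (AWS Rekognition, Age 27-37).
face = {
    "Calm": 46.6, "Happy": 26.2, "Disgusted": 14.6, "Sad": 5.5,
    "Fear": 2.5, "Angry": 2.5, "Surprised": 1.4, "Confused": 0.8,
}

# Dominant emotion = highest-confidence entry; the margin over the
# runner-up can be slim (e.g. the face rated Happy 50.2% vs Calm 46.3%).
dominant = max(face, key=face.get)
margin = face[dominant] - sorted(face.values())[-2]
print(dominant, round(margin, 1))
```

For this face the result is Calm, 20.4 points ahead of Happy; a pipeline that reports only the dominant label discards how close the alternatives were.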

Feature analysis

Amazon

Person 98.1%
Wheel 97.9%
Car 87.9%

Captions

Microsoft

a group of people in a vehicle 77.2%
a group of people riding on the back of a truck 60.4%
a group of people riding on the back of a vehicle 59.5%
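The captioning service returns several candidate descriptions, each scored; a display caption is typically the most confident candidate. A minimal sketch over the three candidates listed above:

```python
# Candidate captions and confidences transcribed from the Microsoft list above.
captions = [
    ("a group of people in a vehicle", 77.2),
    ("a group of people riding on the back of a truck", 60.4),
    ("a group of people riding on the back of a vehicle", 59.5),
]

# Keep the most confident candidate as the display caption.
text, conf = max(captions, key=lambda c: c[1])
print(f'"{text}" ({conf}%)')
```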

Text analysis

Amazon

ZATA
NACO
191 УТЭЗАС NACO
УТЭЗАС
191
11 LEPT

Google

ETERE
ETERE