Human Generated Data

Title

Untitled (women in industrial kitchen)

Date

1956

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19601

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.6
Person 99.6
Person 99.3
Restaurant 98.9
Person 98.6
Person 98.3
Person 97.4
Building 91.8
Cafe 91.1
Cafeteria 87.6
Factory 84.7
Workshop 84.7
Person 78.3
Food 69
Meal 69
Food Court 56.6
Furniture 55.3
Table 55.3

Imagga
created on 2022-03-05

barbershop 54.1
shop 47.1
interior 38
restaurant 37.5
chair 37.2
mercantile establishment 33
room 31.7
table 29.4
modern 28
furniture 23.5
counter 22.8
place of business 22.7
building 21.7
office 19.1
work 18.8
indoors 18.4
business 18.2
people 17.8
design 16.3
man 16.1
structure 15.4
seat 15.1
house 15
architecture 14.8
barroom 14.7
inside 14.7
chairs 14.7
indoor 14.6
cafeteria 14
urban 14
empty 13.7
men 13.7
floor 13
classroom 12.8
kitchen 12.5
industry 11.9
city 11.6
steel 11.5
comfortable 11.5
establishment 11.4
light 11.4
equipment 11.2
home 11.2
person 11.2
bar 11.1
dinner 10.9
glass 10.9
wood 10.8
male 10.6
decor 10.6
dining 10.5
life 10.2
decoration 10.1
tables 9.8
center 9.8
working 9.7
food 9.7
contemporary 9.4
lifestyle 9.4
computer 8.8
sitting 8.6
barber chair 8.5
coffee 8.3
service 8.3
occupation 8.2
window 8.2
worker 8.1
transportation 8.1
group 8.1
machine 8
women 7.9
scene 7.8
travel 7.7
3d 7.7
television camera 7.7
luxury 7.7
professional 7.6
communication 7.6
elegance 7.6
drink 7.5
place 7.4
technology 7.4
hall 7.4
board 7.2
adult 7.1

Google
created on 2022-03-05

Table 93.8
Furniture 93.8
Laptop 85.6
Black-and-white 85.4
Chair 83.9
Style 83.9
Computer 82.6
Building 79.6
Desk 76.4
Monochrome photography 73
Monochrome 71.4
Event 68.8
Room 67.6
City 67
Sitting 66.1
Machine 65.9
Coffee table 64.1
Street 63.1
Factory 62.8
T-shirt 60.5

Microsoft
created on 2022-03-05

indoor 94.5
text 92.6
black and white 87.3
person 86.6
piano 72.8
clothing 63.6
man 57.7
restaurant 35.5

Face analysis


AWS Rekognition

Age 47-53
Gender Female, 58.5%
Happy 44.7%
Surprised 30.1%
Sad 7%
Calm 7%
Angry 5.7%
Confused 1.9%
Fear 1.9%
Disgusted 1.7%

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Confused 49.5%
Calm 31.4%
Sad 6.1%
Fear 4.7%
Happy 3.1%
Surprised 2.7%
Disgusted 1.3%
Angry 1.1%

AWS Rekognition

Age 48-54
Gender Male, 51.7%
Sad 78.4%
Happy 16.5%
Calm 4.1%
Confused 0.2%
Fear 0.2%
Disgusted 0.2%
Surprised 0.2%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people sitting at a table in a restaurant 87.8%
a group of people sitting at a table 87.3%
a group of people sitting at a table with a laptop 82.6%

Text analysis

Amazon

Tam
Tam Tam
KEEP
3
-YT
TamT
KEEP DRY
DOST
DRY
36-1
PACKAGES
36-1 oz
oz
36-8 oz
36-8
MADE
PAILAGES

Google

Tam Tam Blle Tam T
T
Tam
Blle