Human Generated Data

Title

Untitled (people dining at table on veranda)

Date

c. 1975, from c. 1931 original

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21923

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Furniture 99.6
Chair 99.6
Chair 98.9
Human 98.7
Person 98.7
Restaurant 97.6
Person 97.4
Person 94.2
Person 94.1
Person 93.3
Room 92.8
Indoors 92.8
Meal 89.3
Food 89.3
Cafeteria 85.6
Chair 84.4
Cafe 81.1
Table 80.4
Dining Table 80.4
Dining Room 80.3
Person 79.2
Person 78.1
Living Room 75.1
Porch 73.5
Floor 71.9
People 71
Flooring 68.7
Wheel 67.5
Machine 67.5
Couch 67.1
Interior Design 62.8
Dish 59.6

Imagga
created on 2022-03-11

restaurant 81.5
cafeteria 78.2
building 51.5
interior 49.5
chair 46.4
table 37.7
structure 36.2
room 36
furniture 32
house 31.8
home 23.9
inside 23.9
modern 23.8
floor 22.3
counter 21.9
indoor 21
dining 20.9
design 20.8
decor 19.4
wood 19.2
shop 18.8
kitchen 18.2
architecture 18
glass 17.3
window 17.1
chairs 16.6
decoration 16.6
barbershop 16.5
style 16.3
dinner 16
contemporary 15
barroom 14.7
mercantile establishment 13.8
empty 13.7
comfortable 13.4
light 13.4
indoors 13.2
seat 13.1
stool 12.8
food 12.7
drink 12.5
luxury 12
plant 11.9
decorate 11.4
urban 11.4
bar 11.1
residential 10.5
lunch 10.5
business 10.3
place 10.2
eat 10.1
drawer 9.9
tables 9.8
lamp 9.6
hotel 9.5
relax 9.3
place of business 9.2
city 9.1
oven 9
meal 8.9
cabinets 8.9
refrigerator 8.9
stove 8.9
residence 8.8
scene 8.7
patio 8.6
party 8.6
elegant 8.6
office 8.5
service 8.3
stylish 8.1
sun 8.1
classroom 8
plate 7.6
mirror 7.6
tile 7.6
work 7.6
estate 7.6
cook 7.3
wooden 7

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

Face analysis

Amazon

Google

AWS Rekognition

Age 21-29
Gender Male, 65.7%
Calm 44.3%
Happy 32.7%
Surprised 5.9%
Fear 5.8%
Sad 4.4%
Angry 2.8%
Disgusted 2.6%
Confused 1.5%

AWS Rekognition

Age 53-61
Gender Male, 55.4%
Sad 42.5%
Surprised 18.3%
Calm 18%
Fear 9.4%
Angry 6.8%
Disgusted 3.1%
Confused 1.5%
Happy 0.5%

AWS Rekognition

Age 6-12
Gender Male, 66.5%
Calm 91.4%
Fear 3.3%
Sad 2.4%
Surprised 0.9%
Angry 0.8%
Confused 0.5%
Disgusted 0.3%
Happy 0.3%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 79.2%
Fear 12.9%
Surprised 2.5%
Confused 1.9%
Sad 1.9%
Angry 0.6%
Disgusted 0.6%
Happy 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.6%
Person 98.7%
Wheel 67.5%

Captions

Microsoft

a group of people in a store window 49.4%
a group of people in a room 49.3%
a group of people sitting at a table in front of a window 48.5%

Text analysis

Google

TRIOHE
SIRW
VORE TRIOHE DO SHECOBIS D M SIRW
SHECOBIS
D
VORE
DO
M