Human Generated Data

Title

Untitled (large room full of dentists examining patients)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14703

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 96.1
Person 96.1
Person 96
Furniture 89.8
Indoors 86.2
Room 86.2
Person 81.1
Person 80.3
Food 80.2
Meal 80.2
Person 78.3
Chair 76.5
Table 73.8
Dining Table 73.6
Crowd 63.2
Person 63.1
People 61.3
Restaurant 61.1
Cafeteria 61.1
Dish 59.6
Suit 58.1
Coat 58.1
Overcoat 58.1
Clothing 58.1
Apparel 58.1
Art 56.6
Clinic 55.9
Ballroom 55.4
Reception 55.2
Waiting Room 55.2
Reception Room 55.2
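
Each Amazon entry above pairs a detected label with a confidence score on a 0-100 scale. As a minimal sketch of how comparable output could be produced, the snippet below calls AWS Rekognition's detect_labels operation through boto3; it assumes configured AWS credentials, and the file name and MinConfidence threshold are illustrative placeholders, not values taken from the museum record.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the record lists labels down to roughly 55% confidence
)

# Each returned label carries a name and a confidence score (0-100),
# matching the "label score" pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")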

Imagga
created on 2022-01-29

shop 35.1
table 26.9
barbershop 26.5
restaurant 26
glass 26
mercantile establishment 25.8
decoration 20.9
interior 19.4
wedding 19.3
setting 19.3
room 18.8
dinner 18.7
celebration 18.3
place of business 17.3
party 16.3
decor 15.9
banquet 15.7
people 13.9
drink 13.4
dining 13.3
service 13
event 12.9
luxury 12.9
wine 12
reception 11.7
napkin 11.7
boutique 11.5
fork 11.5
glasses 11.1
silverware 10.8
catering 10.8
cutlery 10.7
knife 10.6
shoe shop 10.5
flowers 10.4
plate 10.2
holiday 10
light 10
bouquet 9.7
hotel 9.5
chair 9.5
elegant 9.4
clothing 9.4
building 9.3
place 9.3
life 9.1
food 9.1
modern 9.1
business 9.1
dine 8.8
stall 8.6
establishment 8.6
lunch 8.6
travel 8.4
design 8.4
meal 8.3
window 8.3
home 8
indoors 7.9
work 7.8
arrangement 7.8
vase 7.8
ceremony 7.8
gift 7.7
fancy 7.7
set 7.6
marriage 7.6
alcohol 7.6
eat 7.5
tradition 7.4
structure 7.4
tourist 7.2
dish 7.2
black 7.2
person 7.2
medical 7.1
architecture 7
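
The Imagga tags above follow the same label-plus-confidence pattern. The sketch below is a hedged example of how such tags might be requested from Imagga's public tagging REST API; the endpoint, response shape, credentials, and file name are assumptions made for illustration and do not come from the museum record.

import requests

API_KEY = "your_api_key"        # placeholder credential
API_SECRET = "your_api_secret"  # placeholder credential

with open("photograph.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",  # assumed Imagga tagging endpoint
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry is assumed to pair a tag (keyed by language) with a confidence score.
for item in response.json().get("result", {}).get("tags", []):
    print(item["tag"]["en"], round(item["confidence"], 1))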

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 94.4
table 75.5
furniture 52.2

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 94.6%
Calm 67%
Happy 16.4%
Sad 9.7%
Fear 2.6%
Angry 2.1%
Disgusted 0.8%
Confused 0.7%
Surprised 0.7%

AWS Rekognition

Age 26-36
Gender Male, 78.8%
Calm 60.9%
Sad 18.6%
Happy 16.6%
Confused 1.9%
Fear 0.6%
Angry 0.6%
Surprised 0.4%
Disgusted 0.4%
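
The two AWS Rekognition blocks above each describe one detected face, with an estimated age range, a gender guess, and confidence scores for several emotions. A minimal sketch of how such attributes could be obtained with Rekognition's detect_faces call, assuming boto3 and configured credentials; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned with confidences; sort highest first, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")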

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
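
Google Vision reports face attributes as likelihood ratings rather than percentages. A minimal sketch of how those ratings could be produced with the Cloud Vision face detection API, assuming the current google-cloud-vision client library (2.x or later) and application default credentials; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values (VERY_UNLIKELY ... VERY_LIKELY), as listed above.
for face in response.face_annotations:
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)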

Feature analysis

Amazon

Person 96.1%

Captions

Microsoft

a group of people in a room 87.3%
a group of people in a store 62.6%
a group of people standing in a room 62.5%
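
The Microsoft captions above rank candidate descriptions by confidence. The sketch below is a hedged example of how such captions might be requested from the Azure Computer Vision service; the client calls, endpoint, and key are assumptions about the Microsoft API made for illustration, and the file name is a placeholder.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)

with open("photograph.jpg", "rb") as f:
    analysis = client.describe_image_in_stream(f, max_candidates=3)

# Caption confidences are returned on a 0-1 scale (87.3% above ~ 0.873).
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")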

Text analysis

Amazon

MJI7
MJI7 YT37AS ALSAN
YT37AS
ALSAN
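
The Amazon text results mix whole detected lines with individual words because Rekognition returns both LINE and WORD detections. A minimal sketch of the corresponding detect_text call, assuming boto3 and configured credentials; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection carries its type (LINE or WORD), the detected string,
# and a confidence score.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")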

Google

YT3RA
A
MJ17 YT3RA A
MJ17
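
The Google text results show the same pattern: a full detected string plus its individual fragments. A minimal sketch of how they could be produced with the Cloud Vision text detection API, assuming the google-cloud-vision client library and application default credentials; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text block; the remaining
# annotations are the individual words, matching the list above.
for annotation in response.text_annotations:
    print(annotation.description)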