Human Generated Data

Title

Untitled (six teenage students painting in art classroom with teacher sitting at table)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9390

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Restaurant 99.3
Person 98.8
Person 98.1
Person 95.2
Person 93.8
Cafeteria 85.2
Cafe 84.8
Chair 82.6
Furniture 82.6
Food Court 81.8
Food 81.8
Person 77.9
Meal 75.1
Apparel 74
Clothing 74
Art 59.4
Indoors 59.2
Room 59.2
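
The "label confidence" lines above can be read into structured data for filtering or comparison. A minimal sketch, using a small excerpt of the Amazon tags copied from this record (it does not call the Rekognition API itself); the helper names are illustrative, not part of any service:

```python
# Hypothetical sketch: parse "label confidence" lines, as listed above,
# into (label, score) pairs. raw_tags is a short excerpt from the record.
raw_tags = """Person 99.6
Human 99.6
Restaurant 99.3
Art 59.4
Indoors 59.2
Room 59.2"""

def parse_tags(text):
    """Split each line at the last space: a label may contain spaces
    (e.g. "Food Court 81.8"), but the confidence is always the final token."""
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

def above_threshold(tags, threshold):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in tags if score >= threshold]

tags = parse_tags(raw_tags)
print(above_threshold(tags, 90))  # → ['Person', 'Human', 'Restaurant']
```

Splitting on the last space rather than the first keeps multi-word labels like "Food Court" intact.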

Imagga
created on 2022-01-23

marimba 100
percussion instrument 87.2
musical instrument 67.2
room 42
classroom 36.7
man 35.6
people 30.7
male 29.1
table 26
business 23.1
sitting 22.3
person 22.2
businessman 22.1
smiling 20.3
women 19.8
men 19.7
office 18.2
meeting 17.9
adult 17.5
professional 16.6
couple 16.5
businesswoman 16.4
lifestyle 15.9
happy 15.7
group 15.3
team 15.2
teacher 15.1
indoors 14.9
colleagues 14.6
talking 14.3
businesspeople 14.2
together 14
mature 13.9
restaurant 13.8
chair 13.5
day 13.3
interior 13.3
work 12.6
job 12.4
portrait 12.3
cheerful 12.2
teamwork 12
corporate 12
worker 11.6
working 11.5
desk 11.3
20s 11
holding 10.7
smile 10.7
color 10.6
education 10.4
side 10.3
happiness 10.2
communication 10.1
modern 9.8
casual clothing 9.8
40s 9.7
to 9.7
home 9.6
executive 9.5
adults 9.5
casual 9.3
suit 9
outdoors 9
family 8.9
leisure activity 8.8
discussion 8.8
two people 8.7
standing 8.7
30s 8.7
four 8.6
ethnic 8.6
glass 8.6
togetherness 8.5
enjoyment 8.4
manager 8.4
mother 8.4
house 8.4
confident 8.2
new 8.1
life 8
computer 8
medical 7.9
love 7.9
20 24 years 7.9
discussing 7.9
drinking 7.7
two 7.6
shop 7.6
hall 7.6
wine 7.4
barbershop 7.3
occupation 7.3
indoor 7.3
laptop 7.3
looking 7.2
handsome 7.1

Google
created on 2022-01-23

Clothing 98.7
Furniture 93.2
Table 93
Black-and-white 85.9
Dress 83.8
Art 83.6
Suit 78.3
Vintage clothing 76.5
Monochrome 75.1
Monochrome photography 74.8
Painting 74.7
Chair 72.7
Room 69.9
Design 68.4
Font 67.8
History 67.6
Visual arts 67.1
Classic 65.4
Illustration 64.8
Event 64.6

Microsoft
created on 2022-01-23

table 98
text 96.9
furniture 93.5
chair 79.2
clothing 75.4
dress 71.2
woman 68.9
person 62.7
vase 54.6
house 53.6
wedding dress 50.7
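
The four tag services above can also be compared against each other. A minimal sketch, assuming case-insensitive label matching; each set is a small illustrative excerpt of the labels in this record, not a complete copy:

```python
from collections import Counter

# Hypothetical sketch: which labels do multiple services agree on?
# Each set is an excerpt of the tags listed above, lowercased.
services = {
    "Amazon":    {"person", "restaurant", "chair", "furniture", "art", "room"},
    "Imagga":    {"room", "classroom", "person", "chair", "table", "teacher"},
    "Google":    {"clothing", "furniture", "table", "art", "chair", "room"},
    "Microsoft": {"table", "furniture", "chair", "clothing", "person"},
}

def consensus(services, min_votes):
    """Labels reported by at least min_votes of the services."""
    votes = Counter(label for tags in services.values() for label in tags)
    return sorted(label for label, n in votes.items() if n >= min_votes)

print(consensus(services, 4))  # → ['chair'] (all four excerpts agree)
```

On these excerpts only "chair" is reported by all four services, while labels like "room", "table", and "furniture" appear in three.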

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 74.2%
Calm 74.6%
Sad 8.9%
Surprised 8.4%
Fear 3.1%
Confused 2.8%
Angry 1.2%
Disgusted 0.7%
Happy 0.3%

AWS Rekognition

Age 23-33
Gender Male, 81.4%
Calm 98.1%
Happy 0.5%
Disgusted 0.4%
Sad 0.3%
Angry 0.3%
Surprised 0.2%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Male, 97.4%
Calm 67%
Happy 11.8%
Confused 8.3%
Sad 7.4%
Angry 2.2%
Disgusted 1.7%
Surprised 1%
Fear 0.6%
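
The per-face emotion percentages in the three AWS Rekognition blocks above can be averaged to get a rough overall mood for the scene. A minimal sketch; the values are copied from this record, and the averaging is an illustrative aggregation, not something the service performs:

```python
# Hypothetical sketch: average the emotion scores of the three faces
# reported by AWS Rekognition above, then pick the dominant emotion.
faces = [
    {"Calm": 74.6, "Sad": 8.9, "Surprised": 8.4, "Fear": 3.1,
     "Confused": 2.8, "Angry": 1.2, "Disgusted": 0.7, "Happy": 0.3},
    {"Calm": 98.1, "Happy": 0.5, "Disgusted": 0.4, "Sad": 0.3,
     "Angry": 0.3, "Surprised": 0.2, "Confused": 0.2, "Fear": 0.1},
    {"Calm": 67.0, "Happy": 11.8, "Confused": 8.3, "Sad": 7.4,
     "Angry": 2.2, "Disgusted": 1.7, "Surprised": 1.0, "Fear": 0.6},
]

def mean_emotions(faces):
    """Average each emotion's score over all detected faces."""
    emotions = faces[0].keys()
    return {e: sum(f[e] for f in faces) / len(faces) for e in emotions}

avg = mean_emotions(faces)
dominant = max(avg, key=avg.get)
print(dominant)  # → Calm (about 79.9% on average)
```

Averaging raw percentages is a crude summary, but here it matches the per-face results: all three faces score "Calm" highest.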

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 82.6%

Captions

Microsoft

a group of people in a room 93.9%
a group of people standing in a room 91.8%
a group of people standing in front of a store 76.6%

Text analysis

Amazon

SHOW
KODOK-EVEELA
veces
П.С.С.

Google

p
2aa
a
SHOW
G SHOW 2aa a p MAGO-
G
MAGO-