Human Generated Data

Title

Untitled (group portrait of female yearbook staff seated around table with typewriter while looking at yearbooks)

Date

1952-1953

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6573

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Human 99.7
Person 99.7
Person 99.6
Person 99.3
Person 99.1
Person 99
Person 98.8
Person 97.3
Person 96.9
Clothing 92.4
Apparel 92.4
Chair 81
Furniture 81
Person 73.4
Accessories 71.8
Accessory 71.8
Sunglasses 71.8
People 71.4
Coat 68.8
Overcoat 68.8
Suit 68.8
Table 68.5
Photo 62.9
Photography 62.9
Face 59
Stage 58.8
Indoors 58.3
Room 58.3
Crowd 58.2
Clinic 57.2
Musical Instrument 56.4
Leisure Activities 56.4
Piano 56.4
Person 50.4

Clarifai
created on 2019-03-26

people 99.7
group together 99.1
group 98.5
leader 97.6
adult 96.8
administration 95.8
man 95.7
chair 95.3
woman 94.6
furniture 94
several 91
three 90.7
wear 89.7
two 87.2
four 87.2
five 86.9
actor 85.7
facial expression 83.8
many 82.1
music 81.1

Imagga
created on 2019-03-26

musical instrument 52.2
man 36.9
male 31.9
wind instrument 31.8
person 31.3
people 30.1
percussion instrument 28.1
brass 26.8
adult 25.6
businessman 24.7
office 24.1
men 24
business 23.7
marimba 22.6
group 20.9
professional 20.7
teacher 19.3
smiling 18.8
happy 18.8
meeting 17.9
table 16.4
classroom 16.1
team 16.1
desk 16
work 15.7
sitting 15.4
executive 15.1
women 15
senior 15
room 14.6
couple 13.9
chair 13.6
smile 13.5
businesspeople 13.3
worker 13.2
education 13
indoors 12.3
portrait 12.3
mature 12.1
teamwork 12
corporate 12
job 11.5
cheerful 11.4
manager 11.2
businesswoman 10.9
device 10.8
conference 10.7
student 10.4
looking 10.4
learning 10.3
indoor 10
confident 10
concertina 9.9
suit 9.9
discussion 9.7
interior 9.7
together 9.6
lifestyle 9.4
communication 9.2
inside 9.2
laptop 9.1
free-reed instrument 9
board 9
cornet 9
black 9
school 9
handsome 8.9
home 8.8
class 8.7
vibraphone 8.7
tie 8.5
holding 8.2
successful 8.2
success 8
blackboard 7.9
paper 7.8
standing 7.8
hands 7.8
teaching 7.8
40s 7.8
colleagues 7.8
modern 7.7
30s 7.7
diversity 7.7
collar 7.7
workplace 7.6
two 7.6
talking 7.6
coffee 7.4
computer 7.3
educator 7.2
to 7.1

Google
created on 2019-03-26

Photograph 94.8
Table 83.8
Photography 62.4
History 60.4
Family 57.7
Furniture 56.7
Sitting 54.1
Team 52.3

Microsoft
created on 2019-03-26

person 99
people 68.6
dish 40.6
music 4.6
assembly 2.6

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 51.3%
Confused 45.2%
Surprised 45.1%
Sad 50.4%
Disgusted 45%
Happy 45%
Calm 49.1%
Angry 45.2%

AWS Rekognition

Age 35-52
Gender Female, 53.1%
Disgusted 45.2%
Happy 45.4%
Surprised 45.9%
Angry 45.8%
Sad 49.8%
Confused 45.7%
Calm 47.2%

AWS Rekognition

Age 26-43
Gender Male, 53.4%
Calm 53.6%
Happy 45.2%
Sad 45.3%
Angry 45.3%
Disgusted 45.1%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 35-53
Gender Male, 50.3%
Calm 52.4%
Happy 45.1%
Angry 45.8%
Disgusted 45.3%
Confused 45.2%
Sad 46%
Surprised 45.2%

AWS Rekognition

Age 35-52
Gender Male, 53.6%
Sad 45.6%
Disgusted 45.1%
Happy 45.2%
Calm 53.7%
Angry 45.1%
Confused 45.1%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Female, 52.3%
Sad 47.3%
Calm 51.8%
Confused 45.2%
Happy 45%
Disgusted 45.1%
Surprised 45.2%
Angry 45.4%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Confused 45.2%
Angry 45.2%
Surprised 45.3%
Disgusted 45.1%
Calm 53.8%
Happy 45.1%
Sad 45.3%

AWS Rekognition

Age 38-59
Gender Male, 50.9%
Surprised 46%
Angry 45.9%
Disgusted 45.6%
Happy 45.7%
Sad 49.1%
Confused 45.7%
Calm 47.1%

Feature analysis

Amazon

Person 99.7%
Chair 81%
Sunglasses 71.8%
Piano 56.4%

Text analysis

Amazon

KODEK-ELA