Human Generated Data

Title

Untitled (group portrait of female yearbook staff seated around table with typewriter while looking at yearbooks)

Date

1952-1953

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6574

Machine Generated Data

Tags (label confidence, 0-100)

Amazon
created on 2019-03-26

Human 99.5
Person 99.5
Person 99.4
Person 99.1
Person 98.6
Person 98.4
Person 97.6
Clothing 95
Apparel 95
Person 94.8
Person 92.5
Clinic 82.4
Furniture 79.8
Chair 79.8
Indoors 71.8
Room 71.8
Coat 68.3
Overcoat 68.1
Suit 68.1
Footwear 68
Shoe 68
Table 66.4
Crowd 63.8
Photography 60.8
Photo 60.8
Sleeve 59.7
Stage 57.3
Face 56.7
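
The label/confidence pairs above have the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of how such tags could be reproduced with boto3; the file name is a placeholder and AWS credentials are assumed to be configured in the environment:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path for the scanned photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,  # drop weak guesses; the tags above bottom out around 56
)

# Each label carries a 0-100 confidence, matching the "Person 99.5" style above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```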

Clarifai
created on 2019-03-26

people 99.8
group together 98.8
adult 98.6
group 96.1
man 95.1
woman 95.1
leader 94.7
administration 93.1
furniture 92
several 91.6
wear 88.2
three 87.4
chair 86.2
facial expression 82.3
four 82
five 81.5
many 81.4
stump 80.2
education 77.8
sports equipment 77.6
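
Clarifai's tags follow the same pattern, scored by its public general model. A hedged sketch against the Clarifai v2 REST API; the API key is a placeholder, and the model identifier is an assumption to be checked against Clarifai's documentation:

```python
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder credential
MODEL = "general-image-recognition"        # public general model; ID is an assumption

with open("photo.jpg", "rb") as f:         # placeholder path
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concepts come back with 0-1 values; scale to match the 0-100 tags above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```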

Imagga
created on 2019-03-26

man 36.9
male 34.7
person 31.5
people 29.5
musical instrument 28.1
brass 25.9
adult 24
wind instrument 23.5
business 21.2
teacher 19
office 18.4
classroom 18
men 18
businessman 17.6
happy 17.5
group 16.9
professional 16.6
job 15
work 14.9
room 14.8
smiling 14.4
black 13.8
worker 13.6
education 13
table 13
indoors 12.3
portrait 12.3
meeting 12.2
executive 12
women 11.8
percussion instrument 11.7
chair 11.7
team 11.6
desk 11.3
senior 11.2
sitting 11.2
businesswoman 10.9
lifestyle 10.8
cheerful 10.6
couple 10.4
businesspeople 10.4
smile 10
handsome 9.8
student 9.5
mature 9.3
teamwork 9.3
clothing 9.1
indoor 9.1
hand 9.1
holding 9.1
school 9
corporate 8.6
helmet 8.5
two 8.5
blackboard 8.4
laptop 8.3
camera 8.3
marimba 8.2
human 8.2
board 8.2
confident 8.2
interior 7.9
explaining 7.9
class 7.7
diversity 7.7
casual 7.6
communication 7.5
learning 7.5
showing 7.5
manager 7.4
entertainment 7.4
occupation 7.3
dress 7.2
home 7.2
game 7.1
night 7.1
to 7.1
sport 7.1
happiness 7
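
Imagga's tagging endpoint reports its confidences already on the 0-100 scale shown above. A minimal sketch, assuming an Imagga API key/secret pair (placeholders below) used as HTTP Basic auth:

```python
import requests

AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholder credentials

with open("photo.jpg", "rb") as f:               # placeholder path
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=AUTH,
        files={"image": f},
    )
resp.raise_for_status()

# Each entry pairs a 0-100 confidence with a localized tag name.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```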

Google
created on 2019-03-26

Photograph 95.3
Table 79
Games 72.9
Room 65.7
Photography 62.4
Black-and-white 56.4
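
Google's tags correspond to Cloud Vision label detection, whose raw scores are 0-1 and appear above scaled to percentages. A minimal sketch with the google-cloud-vision client; credentials are assumed to be supplied via GOOGLE_APPLICATION_CREDENTIALS:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Vision API scores are 0-1; scale to match the percentages above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```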

Microsoft
created on 2019-03-26

person 99
people 72.2
dish 52.9
ballet 20
music 13.8
group 5.7
jazz 5.6
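
Microsoft's tags match the Azure Computer Vision tagging endpoint, which likewise returns 0-1 confidences. A minimal sketch against the REST API; the endpoint and subscription key are placeholders:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

with open("photo.jpg", "rb") as f:  # placeholder path
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# Tags carry 0-1 confidences; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```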

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 51.6%
Disgusted 45.2%
Angry 45.7%
Surprised 45.4%
Calm 52%
Happy 45.1%
Sad 46.2%
Confused 45.3%

AWS Rekognition

Age 45-66
Gender Female, 50.1%
Confused 45.2%
Angry 45.6%
Surprised 45.3%
Disgusted 45.2%
Calm 52%
Happy 45.3%
Sad 46.4%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Happy 45.1%
Sad 48.8%
Calm 50%
Angry 45.7%
Confused 45.2%
Disgusted 45.1%
Surprised 45.1%

AWS Rekognition

Age 45-66
Gender Male, 52.6%
Angry 45.2%
Happy 45.2%
Sad 47.4%
Calm 52%
Confused 45.2%
Disgusted 45%
Surprised 45.1%

AWS Rekognition

Age 48-68
Gender Male, 54.5%
Angry 45.3%
Surprised 45.1%
Sad 49%
Calm 46.9%
Confused 45.2%
Happy 48.4%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 53.1%
Sad 45.5%
Surprised 45.1%
Angry 45.4%
Disgusted 45.1%
Confused 45.1%
Calm 53.8%
Happy 45%

AWS Rekognition

Age 30-47
Gender Female, 54.4%
Confused 45.2%
Angry 45.3%
Surprised 45.2%
Sad 48.3%
Disgusted 45.2%
Happy 45.1%
Calm 50.6%
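
Each block above mirrors one entry in the FaceDetails array returned by Rekognition's DetectFaces call when full attributes are requested. A minimal sketch, with a placeholder file name:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # age range, gender, and emotions require "ALL"
    )

# Print each face in the same shape as the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```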

Feature analysis

Amazon

Person 99.5%
Chair 79.8%
Shoe 68%
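
This list is the subset of Rekognition labels that come back with localized instances (bounding boxes), which is what distinguishes a detected feature from a plain tag. A sketch of filtering DetectLabels output down to those; paths and thresholds are placeholders:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

# Only some labels (e.g. Person, Chair, Shoe) carry instances; each instance
# has its own confidence and a bounding box in relative image coordinates.
for label in response["Labels"]:
    for instance in label["Instances"]:
        box = instance["BoundingBox"]  # relative Left/Top/Width/Height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f})')
```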
