Human Generated Data

Title

Untitled (seven women dressed up with hats and bags posed in gymnasium)

Date

1945-1950

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6501

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Human 99.5
Person 99.5
Person 99.3
Person 98.9
Person 98.9
Clothing 98.8
Apparel 98.8
Person 98.6
Person 98.6
Person 98.1
Dress 96
Female 86.7
Furniture 79.5
Chair 78.7
Coat 76.9
Suit 76.9
Overcoat 76.9
Floor 70.4
Indoors 69.4
Girl 69
Woman 66.7
People 65.6
Room 64.4
Hair 63.6
Child 62.3
Kid 62.3
Advertisement 59.2
Shorts 58.9
Poster 57.2
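The machine-generated sections above each pair a label with a confidence score. As a minimal sketch of working with such output (labels transcribed from the Amazon list above; the 90% cutoff is an arbitrary assumption, not part of the record), filtering by confidence might look like:

```python
# Label/confidence pairs transcribed from the Amazon tag list above.
amazon_tags = {
    "Human": 99.5, "Person": 99.5, "Clothing": 98.8, "Apparel": 98.8,
    "Dress": 96.0, "Female": 86.7, "Furniture": 79.5, "Chair": 78.7,
    "Coat": 76.9, "Suit": 76.9, "Overcoat": 76.9, "Floor": 70.4,
    "Indoors": 69.4, "Girl": 69.0, "Woman": 66.7, "People": 65.6,
    "Room": 64.4, "Hair": 63.6, "Child": 62.3, "Kid": 62.3,
    "Advertisement": 59.2, "Shorts": 58.9, "Poster": 57.2,
}

def confident_tags(tags, threshold=90.0):
    """Return labels whose confidence meets the threshold, highest first."""
    ordered = sorted(tags.items(), key=lambda kv: -kv[1])
    return [label for label, score in ordered if score >= threshold]

print(confident_tags(amazon_tags))
# → ['Human', 'Person', 'Clothing', 'Apparel', 'Dress']
```

Only the five strongest labels survive the cutoff, which matches the intuition that the photograph most confidently depicts clothed people in a dress-up pose.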

Clarifai
created on 2019-03-22

people 99.9
group 99.4
group together 98.8
adult 97.1
woman 96.6
many 96.1
wear 95.3
man 92.2
music 89.4
several 86.4
leader 84.8
administration 82.8
outfit 82.6
education 79.8
musician 78.2
five 77
partnership 76.4
dancer 70.8
room 70.6
dancing 70

Imagga
created on 2019-03-22

singer 43.6
person 38.2
musician 37.8
people 36.2
businessman 35.3
man 32.2
business 31
male 30.5
professional 30.1
men 29.2
group 29
performer 28.5
adult 26.8
wind instrument 24.1
corporate 24
brass 22.4
teacher 22.3
work 22
room 20.7
women 20.5
classroom 20
team 19.7
job 19.5
musical instrument 19.3
meeting 18.8
entertainer 18.8
teamwork 18.5
office 17.7
executive 17.6
happy 16.9
worker 15.6
outfit 15.4
businesswoman 15.4
modern 15.4
suit 15.3
success 15.3
smiling 15.2
smile 14.2
manager 14
together 13.1
educator 12.8
portrait 12.3
couple 12.2
boy 12.2
standing 12.2
education 12.1
handsome 11.6
black 11.4
businesspeople 11.4
communication 10.9
student 10.8
indoors 10.5
boss 10.5
youth 10.2
lifestyle 10.1
successful 10.1
laptop 10
confident 10
holding 9.9
to 9.7
diversity 9.6
crowd 9.6
looking 9.6
golfer 9.3
casual 9.3
two 9.3
company 9.3
finance 9.3
girls 9.1
silhouette 9.1
attractive 9.1
cute 8.6
staff 8.6
ethnic 8.6
employee 8.4
player 8.3
human 8.2
fun 8.2
board 8.1
interior 8
conference 7.8
table 7.8
colleagues 7.8
employment 7.7
leader 7.7
pretty 7.7
formal 7.6
studio 7.6
chair 7.6
career 7.6
active 7.5
study 7.5
style 7.4
teen 7.3
children 7.3
stage 7.2
happiness 7

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

wall 97.9
standing 75.8
posing 73.9
old 73.3
black 69.5
white 63.6
ballet 63.6
dancing 5.5
person 3.8

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 54.9%
Angry 45.2%
Calm 47.4%
Confused 45.3%
Disgusted 45.2%
Happy 51.1%
Sad 45.4%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Female, 52.5%
Angry 45.2%
Sad 45.6%
Happy 45.3%
Disgusted 45.1%
Calm 53.5%
Confused 45.2%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Male, 52.9%
Calm 47.6%
Sad 45.6%
Disgusted 45.1%
Surprised 45.3%
Confused 45.1%
Happy 51%
Angry 45.3%

AWS Rekognition

Age 26-43
Gender Male, 53.5%
Happy 52.2%
Sad 45.2%
Surprised 45.1%
Disgusted 45%
Angry 45.1%
Confused 45%
Calm 47.2%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Happy 52.3%
Sad 45.3%
Calm 46.9%
Confused 45.1%
Angry 45.2%
Disgusted 45.1%
Surprised 45.2%

AWS Rekognition

Age 26-43
Gender Female, 53.9%
Confused 45.1%
Angry 45.2%
Surprised 45.1%
Sad 45.3%
Happy 50.7%
Disgusted 45.1%
Calm 48.6%

AWS Rekognition

Age 20-38
Gender Male, 53.2%
Disgusted 45.4%
Happy 48.4%
Calm 47.5%
Sad 46.4%
Angry 45.6%
Confused 45.7%
Surprised 46%
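Each face record above carries one confidence score per emotion. A minimal sketch of reducing one record to its dominant emotion (scores copied from the first AWS Rekognition entry above; the function name is illustrative, not part of any API):

```python
# Emotion/confidence scores transcribed from the first face record above.
first_face = {
    "Angry": 45.2, "Calm": 47.4, "Confused": 45.3, "Disgusted": 45.2,
    "Happy": 51.1, "Sad": 45.4, "Surprised": 45.3,
}

def dominant_emotion(emotions):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

emotion, score = dominant_emotion(first_face)
print(f"{emotion} {score}%")
# → Happy 51.1%
```

Note that the scores cluster near a common floor (around 45%) with one or two emotions standing out, so the argmax rather than any single absolute value is the informative signal.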

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

I1