Human Generated Data

Title

Untitled (studio portrait of woman in black dress sitting on arm of chair)

Date

c. 1905-1910, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5978

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Person 99
Sitting 98.1
Person 98
Person 97.5
Clothing 95.3
Apparel 95.3
Chair 92.2
Furniture 92.2
Performer 85.4
Overcoat 75.8
Coat 75.8
Suit 75.8
Crowd 73.9
Face 68.2
Leisure Activities 63.5
Flooring 62.1
Female 60.6
Footwear 59.7
Shoe 59.7
Woman 59.6
Teen 59.6
Kid 59.6
Blonde 59.6
Child 59.6
Girl 59.6
Audience 57.7
Musician 57.2
Musical Instrument 57.2
Shoe 57.2
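
The Amazon tags above are the kind of output returned by Amazon Rekognition's DetectLabels operation, which scores each label from 0 to 100. A minimal sketch of how such tags can be generated, assuming configured boto3 credentials and a local copy of the image (the file name "photo.jpg" is a placeholder):

    import boto3

    client = boto3.client("rekognition")

    # Placeholder local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels scores each label 0-100, matching the
    # "Person 99.4" style values listed above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")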

Clarifai
created on 2019-11-16

people 99.8
man 97.5
group 97.1
woman 96.8
adult 94.8
music 92.8
room 92.2
stage 91.9
movie 90.7
indoors 90.4
group together 89
child 87.9
actor 87
family 85.4
theater 84.5
musician 84.5
three 84.1
outfit 81.5
portrait 80.9
chair 80.1
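
The Clarifai concepts above match the shape of Clarifai's v2 prediction API, which scores each concept from 0 to 1. A rough sketch; the API key, image URL, and general-model ID are all placeholders:

    import requests

    API_KEY = "YOUR_API_KEY"                # placeholder credential
    MODEL_ID = "general-image-recognition"  # placeholder model ID

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {
            "url": "https://example.com/photo.jpg"}}}]},
    )

    # Concept values are 0-1; scale to match the "people 99.8" style above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")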

Imagga
created on 2019-11-16

man 33.6
business 31
people 29
businessman 23.8
male 23.4
person 22.8
office 22.7
adult 22.4
spectator 22.3
men 19.7
indoor 18.3
corporate 18
suit 16.6
group 16.1
fashion 15.8
black 15.4
room 14.9
indoors 14.1
executive 13.6
window 13.5
clothing 13.2
looking 12.8
businesswoman 12.7
dark 11.7
handsome 11.6
corporation 11.6
couple 11.3
women 11.1
chair 11
happiness 11
silhouette 10.8
interior 10.6
kin 10.6
attractive 10.5
businesspeople 10.4
outfit 10.4
meeting 10.4
sitting 10.3
professional 10.2
face 9.9
posing 9.8
computer 9.8
style 9.6
career 9.5
laptop 9.5
casual 9.3
city 9.1
portrait 9.1
success 8.8
job 8.8
working 8.8
happy 8.8
urban 8.7
building 8.7
light 8.7
lifestyle 8.7
work 8.6
wall 8.5
desk 8.5
clothes 8.4
communication 8.4
manager 8.4
world 8.2
jacket 8.1
love 7.9
elegant 7.7
old 7.7
boss 7.6
life 7.4
military uniform 7.3
body 7.2
home 7.2
team 7.2
employee 7.1
architecture 7
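
The Imagga tags above correspond to Imagga's v2 tagging endpoint, which reports confidence directly on a 0-100 scale. A minimal sketch, with placeholder API credentials (HTTP basic auth) and a placeholder image URL:

    import requests

    # Placeholder Imagga key/secret pair.
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=auth,
    )

    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")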

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 97.4
wall 96.4
person 96.3
footwear 91
black and white 88.2
text 84.7
man 67.6
black 66.7
white 63
woman 60.4
coat 50.3
posing 45.5
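
The Microsoft tags above resemble the output of Azure Computer Vision's analyze endpoint with the Tags visual feature, which scores tags from 0 to 1. A sketch with placeholder endpoint, subscription key, and file name:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )

    # Scale the 0-1 confidence to match the "clothing 97.4" style above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")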

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 53.3%
Calm 54%
Disgusted 45%
Happy 45.7%
Fear 45%
Confused 45.1%
Angry 45.1%
Surprised 45%
Sad 45.1%

AWS Rekognition

Age 27-43
Gender Male, 51.3%
Disgusted 45%
Confused 45%
Angry 45%
Sad 45%
Happy 45%
Surprised 45%
Calm 54.8%
Fear 45%

AWS Rekognition

Age 17-29
Gender Female, 51.8%
Happy 45%
Angry 45%
Confused 45%
Calm 54.8%
Disgusted 45%
Fear 45%
Surprised 45%
Sad 45.1%

AWS Rekognition

Age 5-15
Gender Female, 54.7%
Angry 45%
Confused 45.3%
Disgusted 45%
Happy 45%
Sad 45.1%
Fear 45.3%
Calm 52.8%
Surprised 46.5%
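
The four per-face records above (age range, gender, and emotion scores) have the shape of output produced by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, again assuming configured boto3 credentials and a placeholder file name:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions
    # to each detected face, mirroring the records above.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")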

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 33
Gender Female
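
The Microsoft Cognitive Services estimates above match the Azure Face API v1.0 detect endpoint with the age and gender attributes requested. Microsoft has since retired these attributes, so this is a historical sketch, with placeholder endpoint, key, and file name:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # The age and gender attributes were available in 2019 but have
    # since been retired from the Face API.
    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )

    for face in response.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].title()}")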

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
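
The Google Vision ratings above are categorical rather than numeric because the face-detection API reports each attribute as a likelihood enum (VERY_UNLIKELY through VERY_LIKELY). A minimal sketch using the google-cloud-vision client, assuming application default credentials and a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    # Enum values index into the likelihood names in order.
    likelihood = ("Unknown", "Very unlikely", "Unlikely",
                  "Possible", "Likely", "Very likely")

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print(f"Surprise {likelihood[face.surprise_likelihood]}")
        print(f"Anger {likelihood[face.anger_likelihood]}")
        print(f"Sorrow {likelihood[face.sorrow_likelihood]}")
        print(f"Joy {likelihood[face.joy_likelihood]}")
        print(f"Headwear {likelihood[face.headwear_likelihood]}")
        print(f"Blurred {likelihood[face.blurred_likelihood]}")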

Feature analysis

Amazon

Person 99.4%
Shoe 59.7%

Categories

Imagga

interior objects 95.4%
food drinks 4.3%