Human Generated Data

Title

Untitled (group portrait of twelve young men and women sitting in front of curtain)

Date

1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11169

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Tie 99.8
Accessories 99.8
Accessory 99.8
Person 99
Human 99
Person 98
Person 97.7
Person 97.5
Person 97.4
Person 97.3
Person 97
Person 97
Apparel 96.9
Shoe 96.9
Clothing 96.9
Footwear 96.9
Person 96.5
Person 96.2
Shoe 95.3
Person 93.2
Shoe 92.7
Suit 92
Coat 92
Overcoat 92
People 91.6
Person 91.5
Tie 89.5
Suit 89
Crowd 88.4
Suit 87.1
Sitting 85.6
Skin 85.5
Tie 83
Shoe 79.9
Tie 78.4
Suit 74.9
Suit 71
Jury 67.3
Attorney 66.8
Photography 60.6
Photo 60.6
Press Conference 60.5
Portrait 60.4
Face 60.4
Family 57.5
Stage 56.5
Room 56.2
Indoors 56.2
Performer 55.7
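
Tag lists like the Amazon block above pair a label name with a confidence score. A minimal sketch of how such output could be produced with AWS Rekognition's DetectLabels API (via boto3) follows; the file name and thresholds are placeholders, not details of the actual pipeline behind this record.

```python
# Hypothetical sketch: label tags like those above via AWS Rekognition DetectLabels.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("group_portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # placeholder cutoff; the list above bottoms out near 55%
)

for label in response["Labels"]:
    # e.g. "Tie 99.8", "Person 99.0", ...
    print(f"{label['Name']} {label['Confidence']:.1f}")
```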

Clarifai
created on 2019-11-16

people 99.8
group 99.7
many 99.1
group together 98.6
adult 95.9
woman 95.4
leader 95.3
wear 94.4
administration 94.1
man 92.9
five 89.8
several 89.4
outfit 87.6
military 86.3
chair 82.5
portrait 82
music 79.1
uniform 79.1
child 77.9
musician 77

Imagga
created on 2019-11-16

man 39
groom 36.1
people 35.7
male 34.7
kin 34
person 33.9
businessman 30.9
business 26.7
professional 26.6
adult 26.5
happy 25.1
couple 24.4
group 24.2
team 19.7
happiness 19.6
men 18.9
smiling 18.8
teamwork 18.5
office 18.5
corporate 18
meeting 17
suit 16.2
portrait 16.2
job 15.9
executive 15.8
women 15
together 14.9
businesswoman 14.5
success 14.5
work 14.1
black 14
attractive 14
standing 13.9
lifestyle 13
love 12.6
family 12.5
businesspeople 12.3
worker 12.1
smile 12.1
dress 11.7
room 11.7
handsome 11.6
performer 11.4
manager 11.2
sitting 11.2
two 11
bride 10.8
modern 10.5
teacher 10.2
successful 10.1
dark 10
wind instrument 9.7
brass 9.6
boss 9.6
career 9.5
youth 9.4
wedding 9.2
silhouette 9.1
fashion 9
romantic 8.9
style 8.9
dancer 8.9
working 8.8
looking 8.8
colleagues 8.7
boy 8.7
staff 8.6
joy 8.4
human 8.2
fun 8.2
sexy 8
dance 8
workers 7.8
partner 7.7
pretty 7.7
partnership 7.7
marriage 7.6
hand 7.6
bouquet 7.5
friendship 7.5
company 7.4
holding 7.4
lady 7.3
new 7.3
building 7.2
home 7.2
night 7.1
indoors 7

Google
created on 2019-11-16

Photograph 96.2
People 94.9
Snapshot 83.3
Black-and-white 68.3
Family 66.1
Gentleman 63.5
Photography 62.4
Classic 56.7
History 54.1
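
The Google tags above follow the same name-plus-score pattern. A hedged sketch using the google-cloud-vision client's label-detection helper is below; the file name is a placeholder, and the actual pipeline used for this record is not documented here.

```python
# Hypothetical sketch: image labels like the Google list above via Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application credentials are set

with open("group_portrait.jpg", "rb") as f:  # placeholder file name
    content = f.read()

response = client.label_detection(image=vision.Image(content=content))

for label in response.label_annotations:
    # score is 0-1; the list above shows it as a percentage
    print(f"{label.description} {label.score * 100:.1f}")
```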

Microsoft
created on 2019-11-16

person 99.8
wall 99.2
posing 98.7
clothing 98.5
smile 97
man 92.6
group 91
standing 90.9
text 89.7
suit 82.6
old 80.4
black 79.8
footwear 58.4
clothes 18.9

Face analysis

AWS Rekognition

Age 13-23
Gender Female, 54.1%
Happy 45%
Calm 54.8%
Disgusted 45%
Surprised 45%
Fear 45%
Sad 45%
Confused 45%
Angry 45.1%

AWS Rekognition

Age 22-34
Gender Male, 53.9%
Disgusted 45%
Sad 45%
Happy 45%
Surprised 45%
Calm 54.9%
Fear 45%
Confused 45%
Angry 45%

AWS Rekognition

Age 15-27
Gender Male, 54.1%
Angry 45.1%
Sad 45%
Happy 45%
Disgusted 45%
Calm 54.8%
Confused 45%
Surprised 45%
Fear 45%

AWS Rekognition

Age 17-29
Gender Male, 54.8%
Fear 45%
Happy 45%
Calm 54.9%
Surprised 45%
Disgusted 45%
Angry 45%
Sad 45%
Confused 45%

AWS Rekognition

Age 11-21
Gender Male, 50.2%
Calm 54.4%
Disgusted 45.2%
Sad 45.1%
Happy 45.1%
Angry 45.1%
Fear 45%
Surprised 45.1%
Confused 45.1%

AWS Rekognition

Age 19-31
Gender Male, 52.4%
Disgusted 45%
Fear 45%
Surprised 45%
Angry 45%
Calm 54.7%
Happy 45%
Confused 45.1%
Sad 45.1%

AWS Rekognition

Age 16-28
Gender Male, 50.2%
Calm 51.6%
Sad 45.2%
Happy 45%
Angry 48.2%
Surprised 45%
Confused 45%
Disgusted 45%
Fear 45%

AWS Rekognition

Age 18-30
Gender Male, 54.6%
Disgusted 45%
Angry 45%
Confused 45.1%
Calm 54.7%
Sad 45%
Happy 45%
Surprised 45%
Fear 45%

AWS Rekognition

Age 19-31
Gender Male, 53.2%
Calm 54.6%
Sad 45.1%
Angry 45.1%
Disgusted 45%
Happy 45%
Surprised 45%
Fear 45%
Confused 45.2%

AWS Rekognition

Age 21-33
Gender Female, 54.3%
Calm 53.5%
Sad 45.1%
Fear 45.1%
Confused 45.2%
Angry 45.5%
Happy 45.1%
Disgusted 45.2%
Surprised 45.2%

AWS Rekognition

Age 12-22
Gender Female, 53.4%
Fear 45%
Angry 45%
Sad 45%
Happy 45%
Calm 55%
Confused 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 13-25
Gender Male, 53.7%
Fear 45%
Sad 45%
Confused 45%
Angry 45%
Happy 45%
Calm 54.9%
Surprised 45%
Disgusted 45%
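
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and scores for eight emotions, which is the shape of a FaceDetail returned by Rekognition's DetectFaces call. A minimal sketch follows, assuming boto3 with configured credentials; the file name is a placeholder.

```python
# Hypothetical sketch: per-face age range, gender, and emotions via DetectFaces.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("group_portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # e.g. "Calm 54.8%", "Happy 45.0%", ...
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```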

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 23
Gender Male

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 26
Gender Male

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 41
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male
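
The Microsoft Cognitive Services entries each give a single age estimate and a gender label, which resembles what the classic Azure Face API v1.0 detect endpoint returned when the age and gender attributes were requested. The sketch below is illustrative only: the endpoint, key, and file name are placeholders, and Microsoft has since restricted these attributes.

```python
# Hypothetical sketch: age/gender pairs like those above from the classic
# Azure Face API v1.0 detect endpoint. Endpoint, key, and file name are
# placeholders; the attribute set has since been restricted by Microsoft.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR-FACE-API-KEY"                                       # placeholder

with open("group_portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {round(attrs['age'])}")
    print(f"Gender {attrs['gender'].capitalize()}")
```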

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
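
The Google Vision face entries bucket each attribute (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) into likelihood categories such as "Very unlikely", matching the face-annotation fields exposed by the google-cloud-vision client. A hedged sketch follows; the file name is a placeholder.

```python
# Hypothetical sketch: likelihood buckets like the Google Vision entries above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application credentials are set

with open("group_portrait.jpg", "rb") as f:  # placeholder file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

def bucket(likelihood):
    # Convert the enum (e.g. VERY_UNLIKELY) to the display form used above.
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```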

Feature analysis

Amazon

Tie 99.8%
Person 99%
Shoe 96.9%
Suit 92%

Categories

Imagga

events parties 53.5%
people portraits 46.3%