Human Generated Data

Title

Untitled (group portrait of men and women dressed up)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19158

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 100
Apparel 100
Person 98.7
Human 98.7
Suit 98.6
Overcoat 98.6
Coat 98.6
Person 97.6
Person 97
Person 96.6
Person 96.3
Person 95.9
Person 95.7
Person 94.5
Dress 94.2
Person 94
Evening Dress 93.7
Robe 93.7
Gown 93.7
Fashion 93.7
Person 92.9
Person 92.5
Person 91.4
Person 90.9
Tuxedo 90.4
Person 90.1
Female 86.5
Person 85
Person 81.8
Person 80.5
Wedding 77.7
People 73.3
Woman 69.2
Wedding Gown 65.7
Face 62.7
Portrait 62.2
Photography 62.2
Photo 62.2
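
The Amazon tags above come from AWS Rekognition label detection. As a rough illustration only, the sketch below shows how a comparable label list can be requested with the Rekognition DetectLabels API through boto3; the file name, region, and MinConfidence threshold are placeholder assumptions, not the museum's actual pipeline.

```python
# Minimal sketch: label tags like the Amazon list above via AWS Rekognition
# DetectLabels. File name, region, and thresholds are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("group_portrait.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=60,  # the list above bottoms out near 62
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```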

Clarifai
created on 2023-10-22

people 99.9
group 99.3
adult 98.6
man 98.3
woman 97.4
wear 96.6
group together 95.2
education 94.1
many 93.4
dress 89.6
outfit 88.8
monochrome 87.6
actor 87.3
wedding 86.9
music 86.2
portrait 85.4
leader 84.3
school 81.7
dinner jacket 81.6
veil 80.4
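
The Clarifai tags above are concept predictions from a general image-recognition model. The hedged sketch below shows one way such concepts can be requested over Clarifai's public v2 REST API; the endpoint path, model name, and payload shape are assumptions, and the token and URL are placeholders.

```python
# Hedged sketch of a Clarifai general concept-tagging request; endpoint, model
# name, and payload shape are assumptions based on Clarifai's public v2 REST API.
import requests

CLARIFAI_PAT = "YOUR_PERSONAL_ACCESS_TOKEN"          # placeholder credential
IMAGE_URL = "https://example.org/group_portrait.jpg" # hypothetical image URL

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```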

Imagga
created on 2022-03-05

people 39.6
man 37.6
male 35.4
businessman 34.4
person 32.7
team 31.3
teamwork 30.6
silhouette 29.8
business 28.5
crowd 26.9
work 25.9
group 24.2
men 23.2
groom 22.3
singer 22.2
outfit 20.5
boss 20.1
musician 19.7
job 19.4
couple 19.1
businesswoman 18.2
dress 18.1
adult 17.8
occupation 17.4
professional 17.2
performer 16.4
suit 16
women 15.8
design 15.7
brass 15.4
success 15.3
happy 15
love 15
presentation 14.9
president 14.7
wind instrument 14.7
speech 13.7
sexy 13.6
black 13.4
nation 13.2
vibrant 13.1
vivid 13
cornet 13
flag 12.8
nighttime 12.7
leader 12.5
meeting 12.2
corporate 12
supporters 11.8
cheering 11.7
audience 11.7
stadium 11.7
patriotic 11.5
bright 11.4
entertainer 11.4
marriage 11.4
bride 11.3
lights 11.1
musical instrument 10.9
symbol 10.8
workers 10.7
together 10.5
happiness 10.2
smiling 10.1
wedding 10.1
staff 9.7
icon 9.5
executive 9.5
youth 9.4
smile 9.3
portrait 9.1
shadow 9
handsome 8.9
office 8.8
standing 8.7
party 8.6
bouquet 8.5
teacher 8.5
celebration 8
colleagues 7.8
attendant 7.7
attractive 7.7
two 7.6
dance 7.6
tie 7.6
human 7.5
manager 7.4
company 7.4
confident 7.3
family 7.1
night 7.1
kin 7
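
The Imagga tags above can be reproduced, in outline, with Imagga's tagging endpoint. The sketch below assumes the public /v2/tags REST endpoint with basic-auth key and secret; credentials and the image URL are placeholders.

```python
# Hedged sketch of an Imagga tagging call; /v2/tags with basic auth follows
# Imagga's public REST API, with placeholder credentials and image URL.
import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholder credentials
IMAGE_URL = "https://example.org/group_portrait.jpg"   # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```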

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

dress 96.7
person 95.6
text 95.5
clothing 92.7
woman 88.1
standing 84
wedding dress 75.8
black and white 75.3
suit 71.3
group 67.6
white 64.9
posing 63.7
people 57.6
bride 50.6
clothes 28
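
The Microsoft tags above correspond to Azure Computer Vision image tagging. The hedged sketch below uses the service's REST "Tag Image" operation; the resource endpoint, API version, key, and image URL are placeholder assumptions.

```python
# Hedged sketch of an Azure Computer Vision "Tag Image" call over REST; the
# endpoint host, API version, and key are placeholder assumptions.
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/group_portrait.jpg"                  # hypothetical

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```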

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Female, 74.4%
Calm 98%
Happy 0.9%
Surprised 0.7%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0%
Angry 0%

AWS Rekognition

Age 47-53
Gender Male, 99.5%
Calm 75.8%
Happy 18.3%
Sad 3.7%
Fear 0.7%
Disgusted 0.5%
Confused 0.4%
Surprised 0.4%
Angry 0.2%

AWS Rekognition

Age 40-48
Gender Male, 98.3%
Calm 99.6%
Happy 0.2%
Surprised 0.1%
Confused 0%
Disgusted 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Calm 98%
Sad 0.7%
Confused 0.5%
Fear 0.2%
Surprised 0.2%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 49-57
Gender Male, 96.2%
Calm 85.1%
Happy 8.1%
Sad 5.8%
Confused 0.3%
Disgusted 0.3%
Surprised 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Female, 97%
Calm 94.9%
Happy 2.9%
Sad 1.2%
Confused 0.3%
Surprised 0.3%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 78.7%
Happy 93.3%
Sad 2.5%
Calm 1.4%
Fear 1.3%
Surprised 0.4%
Disgusted 0.4%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 31-41
Gender Male, 71.7%
Happy 72.7%
Calm 25.5%
Fear 0.4%
Sad 0.4%
Surprised 0.4%
Disgusted 0.3%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Sad 94%
Calm 3.4%
Fear 1.3%
Confused 0.5%
Surprised 0.3%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Calm 92.1%
Sad 3.7%
Disgusted 1%
Confused 1%
Angry 0.8%
Surprised 0.7%
Happy 0.4%
Fear 0.4%

AWS Rekognition

Age 41-49
Gender Male, 95.9%
Calm 90.2%
Sad 3%
Confused 2.5%
Happy 2.1%
Fear 0.9%
Disgusted 0.5%
Surprised 0.4%
Angry 0.4%

AWS Rekognition

Age 43-51
Gender Male, 78.7%
Calm 97.2%
Sad 1%
Confused 0.5%
Disgusted 0.4%
Happy 0.4%
Angry 0.2%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 99.6%
Calm 98.8%
Surprised 0.9%
Happy 0.1%
Sad 0.1%
Confused 0%
Fear 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 45-53
Gender Male, 91.4%
Calm 97.9%
Sad 1.5%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
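
The age ranges, gender estimates, and emotion percentages in the AWS Rekognition blocks above are the kind of output returned by the Rekognition DetectFaces API. A minimal sketch follows, assuming a local copy of the photograph; the file name and region are placeholders, and this is not the museum's actual pipeline.

```python
# Minimal sketch: per-face age/gender/emotion estimates like the AWS
# Rekognition blocks above, via DetectFaces with all attributes requested.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("group_portrait.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```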

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
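
The Google Vision blocks above report per-face likelihoods for surprise, anger, sorrow, joy, headwear, and blur. A minimal sketch of requesting those likelihoods with the google-cloud-vision client follows; it assumes Application Default Credentials are configured and uses a placeholder file name.

```python
# Minimal sketch: per-face likelihoods like the Google Vision blocks above,
# using the Cloud Vision face_detection helper. File name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("group_portrait.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```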

Feature analysis

Amazon

Person
Person 98.7%
Person 97.6%
Person 97%
Person 96.6%
Person 96.3%
Person 95.9%
Person 95.7%
Person 94.5%
Person 94%
Person 92.9%
Person 92.5%
Person 91.4%
Person 90.9%
Person 90.1%
Person 85%
Person 81.8%
Person 80.5%

Categories

Text analysis

Amazon

G
ОО
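
The detected strings above ("G" and "ОО") are the sort of result returned by AWS Rekognition text detection. A minimal sketch follows, assuming a local copy of the image; the file name and region are placeholders.

```python
# Minimal sketch: in-image text detection like the Amazon text analysis above,
# via the Rekognition DetectText API.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("group_portrait.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the individual WORD detections
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```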