Human Generated Data

Title

Untitled (four people at ball)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19283

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Clothing 98.9
Apparel 98.9
Person 97
Human 97
Person 96.8
Person 96.6
Person 96.5
Suit 88.7
Overcoat 88.7
Coat 88.7
Long Sleeve 80.4
Sleeve 80.4
Text 75.1
Robe 70.8
Fashion 70.8
Gown 70.5
Advertisement 69.2
People 69.1
Shirt 67.2
Poster 66
Female 64.6
Tuxedo 64.4
Evening Dress 59.2
Art 57.8
Wedding 57.7
Woman 56.3
Floor 55.9
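
The label-and-confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. As a rough illustration (not necessarily the pipeline that produced this record), the call could look like the Python sketch below; the file name "photo.jpg", the region, and the 55-point confidence floor are assumptions for the example.

import boto3

# Hypothetical local scan of the photograph; not part of this record.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition", region_name="us-east-1")

# DetectLabels returns label names with 0-100 confidence scores,
# comparable to the "Clothing 98.9" and "Person 97" entries above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")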

Clarifai
created on 2023-10-22

people 99.7
wedding 98.9
man 98.3
woman 97.8
bride 97.7
groom 97.3
family 96.4
group 95.7
dress 95.7
adult 94.3
dinner jacket 89.8
wear 89.6
veil 87.9
actor 86.5
portrait 86.4
two 84
three 81.9
room 81.8
retro 81.3
actress 80.8
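
Clarifai concept scores like those above come from its v2 predict endpoint, which reports values between 0 and 1 (scaled here to match the 0-100 style of the list). A minimal sketch assuming key-based auth and the public general recognition model; the model id, API key, and file name are placeholders and may differ from whatever produced this record.

import base64
import requests

MODEL_ID = "general-image-recognition"   # assumed public general model id
API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential

with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Each concept carries a 0-1 value; multiplied by 100 to match "people 99.7" style scores.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")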

Imagga
created on 2022-02-25

business 47.4
professional 45.5
man 41.7
corporate 41.3
office 41
businessman 39.8
male 37.6
suit 35.3
people 33.5
building 31.6
team 30.5
work 29.8
person 27.7
meeting 27.3
businesswoman 27.3
happy 25.7
job 25.7
adult 25.5
success 24.2
executive 24.1
successful 23.8
group 23.4
teamwork 22.3
smile 21.4
women 21.4
attractive 20.3
men 19.8
manager 19.6
black 19.2
worker 19.1
teacher 18.8
diverse 18.6
life 17.6
diversity 17.3
groom 17.1
occupation 16.5
boss 16.3
ethnic 16.2
communication 16
company 15.8
businesspeople 15.2
career 15.2
pretty 14.7
working 14.2
partnership 13.5
talking 13.3
holding 13.2
portrait 12.3
standing 12.2
happiness 11.8
educator 11.7
colleagues 11.7
smiling 11.6
education 11.3
modern 11.2
looking 11.2
youth 11.1
confident 10.9
student 10.9
businessmen 10.7
tie 10.4
outside 10.3
two 10.2
phone 10.1
laptop 10
conference 9.8
human 9.8
lady 9.7
workers 9.7
partner 9.7
corporation 9.7
formal 9.6
university 9.3
fashion 9.1
employee 8.7
college 8.7
lifestyle 8.7
day 8.6
confidence 8.6
workplace 8.6
briefcase 8.5
presentation 8.4
clothing 8.3
20s 8.3
computer 8.2
jacket 8.1
handsome 8
couple 7.8
handshake 7.8
40s 7.8
discussion 7.8
sales 7.7
leadership 7.7
profession 7.7
adults 7.6
technology 7.4
cell 7.4
garment 7.4
school 7.2
architecture 7
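
Imagga tags of this kind are served by its v2 tagging endpoint, authenticated with an API key and secret over HTTP basic auth. A hedged sketch follows; the credentials and the image URL are placeholders, not values from this record.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical public URL for the scan

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs an English tag ("business", "suit", ...) with a 0-100 confidence.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")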

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.1
wedding dress 98.9
bride 97.9
dress 92.8
person 88.6
wedding 88.5
suit 81.6
clothing 79
gallery 78.7
woman 78
room 62.6
posing 55.1
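
The Microsoft tags above resemble the Tags feature of the Azure Computer Vision Analyze API, which returns confidences between 0 and 1 (scaled here to 0-100). A sketch against the v3.2 REST endpoint; the resource endpoint, key, and file name are placeholders, and the exact API version behind this record is not stated.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder resource
KEY = "YOUR_AZURE_KEY"                                          # placeholder credential

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# Scale 0-1 confidences to match the "wedding dress 98.9" style above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")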

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 91.5%
Calm 70.5%
Confused 18.2%
Happy 6.7%
Surprised 1.8%
Angry 1%
Disgusted 0.7%
Fear 0.6%
Sad 0.6%

AWS Rekognition

Age 16-24
Gender Male, 99.5%
Happy 100%
Surprised 0%
Calm 0%
Confused 0%
Angry 0%
Fear 0%
Disgusted 0%
Sad 0%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Happy 94.6%
Disgusted 1.2%
Angry 1.1%
Calm 1%
Surprised 0.7%
Sad 0.6%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 23-31
Gender Female, 99.1%
Calm 95%
Confused 1.9%
Surprised 0.9%
Angry 0.7%
Sad 0.5%
Disgusted 0.4%
Happy 0.3%
Fear 0.3%
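
The four AWS Rekognition blocks above (an age range, a gender estimate, and an emotion distribution per face) match the shape of DetectFaces output when all attributes are requested. A minimal sketch, again assuming a hypothetical local photo.jpg and an illustrative region:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned with 0-100 confidences, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")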

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
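
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages; these correspond to the likelihood fields on face annotations. A sketch with the google-cloud-vision client library, assuming application credentials are already configured and the scan is available locally:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY), one block per face.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)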

Feature analysis

Amazon

Person
Person 97%
Person 96.8%
Person 96.6%
Person 96.5%

Categories

Text analysis

Amazon

JAN
65
128
132

Google

JAN 65 DEXIU 128 132
JAN
65
DEXIU
128
132
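
The detected strings ("JAN 65", "128", "132", and Google's additional "DEXIU") read like a processing stamp on the print. Amazon's side of this section corresponds to the Rekognition DetectText operation, which returns both LINE and WORD detections; the Google tokens would come from the text_detection feature of the same ImageAnnotatorClient shown above. A minimal Rekognition sketch, using the same hypothetical photo.jpg:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections reproduce strings such as "JAN 65"; WORD detections give the
# individual tokens listed above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))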