Human Generated Data

Title

Untitled (couple dancing at ball)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19297

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 98.8
Human 98.8
Person 96.6
Person 95.8
Person 94.4
Plant 92.8
Flower 91
Blossom 91
Suit 89.9
Coat 89.9
Overcoat 89.9
Clothing 89.9
Apparel 89.9
Flower Arrangement 87.9
Flower Bouquet 87.4
Person 72.8
Suit 68.5
Interior Design 64.7
Indoors 64.7
Floor 57.1

Clarifai
created on 2023-10-22

people 99.3
wedding 98
woman 97.9
portrait 97.7
love 96.4
man 96
family 95.6
dress 94.5
wear 94.4
groom 93.5
girl 93.1
adult 93
fashion 92.9
two 92.3
indoors 92.2
bride 91.2
doorway 89.7
couple 88.2
room 88
facial expression 85.2

Imagga
created on 2022-02-25

robe 50.1
garment 43.1
clothing 37.3
man 33.6
people 27.9
male 25.9
person 25.8
business 24.3
adult 23.6
professional 23.1
corporate 21.5
standing 20.9
dress 19.9
attractive 19.6
office 19.5
happy 19.4
pretty 18.2
fashion 17.3
portrait 16.8
businessman 16.8
women 16.6
black 15.8
work 15.7
covering 15.3
couple 14.8
lifestyle 14.5
two 14.4
smile 14.3
interior 14.2
suit 13.8
smiling 13.7
looking 13.6
happiness 13.3
consumer goods 13.3
lady 13
building 12.8
businesswoman 12.7
life 12.7
modern 12.6
groom 12.3
door 12.3
men 12
holding 11.6
indoors 11.4
group 11.3
communication 10.9
team 10.8
executive 10.7
cheerful 10.6
success 10.5
casual 10.2
shop 9.8
job 9.7
style 9.6
room 9.6
hands 9.6
career 9.5
meeting 9.4
clothes 9.4
student 9.3
child 9.2
indoor 9.1
jacket 8.9
boy 8.7
diversity 8.7
customer 8.6
model 8.6
businesspeople 8.5
youth 8.5
teacher 8.4
teamwork 8.3
confident 8.2
handsome 8
love 7.9
brunette 7.8
workers 7.8
full length 7.8
ethnic 7.6
finance 7.6
elegance 7.6
human 7.5
successful 7.3
friendly 7.3
home 7.2
worker 7.1

Google
created on 2022-02-25

Product 90.7
Plant 89.1
Sleeve 86.8
Picture frame 85.9
Yellow 85.8
Gesture 85.3
Smile 84.7
Door 76.2
Rectangle 74.9
Event 73.7
Formal wear 73.5
Font 72.9
Flower 72
Curtain 68.7
Happy 67.2
Flash photography 66.5
Room 65.8
Monochrome photography 65.5
Stock photography 62.5
Monochrome 60.4

Microsoft
created on 2022-02-25

text 98.2
standing 93.3
person 89.2
clothing 86.4
woman 81.8
flower 81.2
posing 74.4
smile 63.5
dress 59
suit 58.3
picture frame 6.8

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Happy 57.8%
Sad 15.5%
Surprised 6.4%
Disgusted 6.1%
Calm 4.8%
Fear 3.4%
Angry 3.1%
Confused 2.9%

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 99.6%
Fear 0.1%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Sad 0%
Confused 0%
Calm 0%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Angry 98.3%
Calm 1.2%
Sad 0.3%
Fear 0%
Surprised 0%
Disgusted 0%
Confused 0%
Happy 0%

Microsoft Cognitive Services

Age 34
Gender Female

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Suit
Person 98.8%
Person 96.6%
Person 95.8%
Person 94.4%
Person 72.8%
Suit 89.9%
Suit 68.5%

Text analysis

Amazon

123
65
JAN
132

Google

132 ६ JAN 65
132
JAN
65