Human Generated Data

Title

Untitled (couples on dance floor)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19311

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Human 99.5
Person 99.5
Leisure Activities 98.9
Dance Pose 98.9
Person 97.6
Person 97
Person 95.3
Shoe 94.5
Apparel 94.5
Footwear 94.5
Clothing 94.5
Shoe 89.5
Dance 89.2
Home Decor 88.7
Tango 85.8
Person 83.2
Text 75.6
Coat 74.7
Overcoat 74.7
Suit 74.7
People 59.2
Floor 58.7
Door 57.5
Flooring 57.2
Photo 57.1
Photography 57.1
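
Scores like these are what AWS Rekognition's DetectLabels operation returns: one label name per detection, each with a percentage confidence. A minimal sketch using boto3 (the filename is a placeholder, and AWS credentials are assumed to be configured in the environment):

    import boto3

    client = boto3.client("rekognition")

    # Placeholder filename; the museum image itself is not distributed here.
    with open("image.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels pairs each label with a confidence score, as in the list above.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))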

Imagga
created on 2022-02-25

groom 41.1
man 37
people 32.4
male 31.3
person 29.3
business 27.3
corporate 26.6
businessman 26.5
adult 26.1
professional 24.9
locker 24
office 21.8
suit 21.7
standing 20
men 19.8
fastener 19.1
building 17.5
couple 17.4
work 17.3
black 17.1
portrait 16.8
women 16.6
dress 16.3
device 15
restraint 14.5
happy 14.4
two 14.4
fashion 14.3
group 13.7
executive 13.6
looking 13.6
handsome 13.4
clothing 13.3
job 13.3
success 12.9
life 12.6
lifestyle 12.3
meeting 12.3
casual 11.9
tie 11.4
urban 11.4
adults 11.4
attractive 11.2
call 11
communication 10.9
city 10.8
team 10.8
holding 10.7
pretty 10.5
manager 10.2
jacket 10.2
style 9.6
career 9.5
happiness 9.4
smiling 9.4
model 9.3
smile 9.3
occupation 9.2
indoor 9.1
modern 9.1
student 9.1
window 8.8
shop 8.7
room 8.7
talking 8.6
teacher 8.5
bow tie 8.5
company 8.4
phone 8.3
one 8.2
businesswoman 8.2
cheerful 8.1
garment 8
to 8
interior 8
indoors 7.9
love 7.9
day 7.8
hands 7.8
full length 7.8
corporation 7.7
youth 7.7
boss 7.7
businesspeople 7.6
desk 7.6
human 7.5
sale 7.4
inside 7.4
successful 7.3
alone 7.3
worker 7.2
family 7.1
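
The Imagga tags have the same shape: a tag string plus a confidence score. A hedged sketch against Imagga's v2 /tags REST endpoint, assuming the publicly documented response layout (API key, secret, and image URL are placeholders):

    import requests

    API_KEY = "your_api_key"        # placeholder credentials
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.com/image.jpg"  # placeholder image URL

    # Imagga's v2 tagging endpoint authenticates with HTTP Basic auth.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))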

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 98.3
wall 97.7
suit 95.1
clothing 90.9
person 85.8
man 82.8
standing 75.2
handwriting 72.1
dress 64.9
smile 63
footwear 60.9
poster 59.7
posing 51.9
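
The Microsoft tags correspond to Azure Computer Vision's tagging operation, which scores tags in [0, 1] rather than as percentages. A sketch with the azure-cognitiveservices-vision-computervision SDK (endpoint, key, and image URL are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "your_key"                                                   # placeholder
    IMAGE_URL = "https://example.com/image.jpg"                        # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # tag_image returns names with confidences in [0, 1]; scale to match the list above.
    result = client.tag_image(IMAGE_URL)
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))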

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 100%
Happy 99.7%
Surprised 0.2%
Angry 0%
Fear 0%
Confused 0%
Calm 0%
Disgusted 0%
Sad 0%

AWS Rekognition

Age 54-64
Gender Male, 99.8%
Happy 98.6%
Surprised 0.4%
Fear 0.3%
Confused 0.2%
Angry 0.2%
Sad 0.1%
Disgusted 0.1%
Calm 0%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Happy 89.9%
Sad 3.2%
Angry 2%
Calm 1.8%
Disgusted 1.4%
Fear 0.6%
Confused 0.6%
Surprised 0.5%
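
The three age/gender/emotion breakdowns above match the output of Rekognition's DetectFaces operation with all attributes requested. A minimal boto3 sketch (placeholder filename, credentials assumed configured):

    import boto3

    client = boto3.client("rekognition")
    with open("image.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and per-emotion confidences.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        print("Age {Low}-{High}".format(**face["AgeRange"]))
        print("Gender {Value}, {Confidence:.1f}%".format(**face["Gender"]))
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(emotion["Type"].title(), f'{emotion["Confidence"]:.1f}%')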

Microsoft Cognitive Services

Age 58
Gender Male

Microsoft Cognitive Services

Age 42
Gender Female
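
The two Microsoft estimates correspond to the Azure Face API's detection call with age and gender attributes requested (Microsoft has since restricted these attributes, so this sketch reflects the older SDK surface; endpoint, key, and image URL are placeholders):

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "your_key"                                                   # placeholder
    IMAGE_URL = "https://example.com/image.jpg"                        # placeholder

    client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Ask the service to estimate age and gender for each detected face.
    faces = client.face.detect_with_url(
        url=IMAGE_URL, return_face_attributes=["age", "gender"]
    )
    for face in faces:
        print("Age", round(face.face_attributes.age))
        print("Gender", face.face_attributes.gender)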

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
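
Unlike the other services, Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than numeric scores, which is exactly what the five blocks above show. A sketch with the google-cloud-vision client (placeholder filename; credentials assumed via GOOGLE_APPLICATION_CREDENTIALS):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("image.jpg", "rb") as f:  # placeholder filename
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))
    # Each annotation carries likelihood enums, not confidence percentages.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)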

Feature analysis

Amazon

Person 99.5%
Shoe 94.5%

Captions

Microsoft

a group of people posing for the camera 90.7%
a group of people posing for a picture 90.6%
a group of people posing for a photo 86.7%
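
Caption candidates like these, each with a confidence, are what Azure Computer Vision's describe operation returns. A sketch reusing the same SDK as above (placeholders as before):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "your_key"                                                   # placeholder
    IMAGE_URL = "https://example.com/image.jpg"                        # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Request up to three candidate captions, mirroring the three shown above.
    result = client.describe_image(IMAGE_URL, max_candidates=3)
    for caption in result.captions:
        print(caption.text, round(caption.confidence * 100, 1))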

Text analysis

Amazon

65
194
JAN
132
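
The Amazon entries are raw strings from Rekognition's DetectText operation. A minimal boto3 sketch that prints detected lines (placeholder filename):

    import boto3

    client = boto3.client("rekognition")
    with open("image.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    # DetectText returns both LINE and WORD detections; print the lines.
    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])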

Google

132
65
194 132 JAN 65
194
JAN
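
The Google list mixes a full-text entry ("194 132 JAN 65") with individual tokens, which matches Google Vision's text detection: the first annotation is the complete detected text and the rest are its parts. A sketch with the same google-cloud-vision client as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("image.jpg", "rb") as f:  # placeholder filename
        content = f.read()

    response = client.text_detection(image=vision.Image(content=content))
    # text_annotations[0] is the full text; the remainder are individual tokens.
    for annotation in response.text_annotations:
        print(annotation.description)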