Human Generated Data

Title

Untitled (couple dancing in hallway)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19421

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 98.1
Apparel 98.1
Person 98
Human 98
Person 93.2
Shoe 81.6
Footwear 81.6
Person 80.7
Coat 75.6
Suit 66.9
Overcoat 66.9
Doctor 64.4
Door 64.3
Clinic 61.4
Chair 59
Furniture 59

Imagga
created on 2022-03-05

megaphone 35.2
acoustic device 28.5
device 28.4
man 26.9
person 26.1
adult 25.9
male 24.1
people 24
fashion 21.1
men 19.7
portrait 18.1
black 17.4
business 17
urban 16.6
model 16.3
professional 16.1
city 15.8
attractive 14.7
suit 14.5
elegance 14.3
equipment 13.6
human 13.5
style 13.3
posing 13.3
pretty 13.3
businessman 13.2
standing 13
corporate 12.9
street 12.9
exercise 11.8
club 11.3
hair 11.1
dress 10.8
sports equipment 10.5
building 10.3
motion 10.3
office 9.8
handsome 9.8
clothing 9.8
lady 9.7
job 9.7
one 9.7
summer 9.6
sexy 9.6
sport 9.6
body 9.6
women 9.5
happy 9.4
action 9.3
modern 9.1
dumbbell 9.1
sensuality 9.1
life 9
outdoors 9
weight 8.8
call 8.7
lifestyle 8.7
work 8.6
cute 8.6
wall 8.6
face 8.5
worker 8.5
legs 8.5
outdoor 8.4
manager 8.4
occupation 8.2
competition 8.2
pose 8.2
stylish 8.1
pay-phone 8
telephone 8
looking 8
cool 8
ball 8
jacket 7.9
day 7.8
brunette 7.8
mask 7.8
elegant 7.7
outside 7.7
industry 7.7
casual 7.6
hand 7.6
fashionable 7.6
holding 7.4
safety 7.4
alone 7.3
playing 7.3
fitness 7.2
smile 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.3
black and white 94
person 86.8
street 84.6
footwear 82.6
clothing 72.4
monochrome 57.1

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Male, 84.6%
Calm 42%
Sad 20.4%
Confused 10.4%
Happy 9.1%
Surprised 7.8%
Disgusted 4.2%
Fear 4.2%
Angry 1.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Shoe 81.6%

Captions

Microsoft

a person standing in front of a building 68.3%
a person that is standing in front of a building 63.7%
a person standing in front of a building 63.6%

Text analysis

Amazon

F
MJI7
و
A°2
MAGOX
MJI7 YT37A2 MAGOX
MAGOM
MJIR YT37 A°2 MAGOM
MJIR
YT37
YT37A2

Google

2.vE
EIA
KODYM KODYK 2.vE EIA Eirw
KODYM
Eirw
KODYK