Human Generated Data

Title

Untitled (two men standing outside building, wearing hats and one holding a rope)

Date

c. 1935

People

Artist: C. Bennette Moore, American, 1879-1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21839

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Apparel 99.9
Clothing 99.9
Human 99.8
Person 99.8
Person 99.3
Hat 76.9
Face 68.5
Female 66.5
Outdoors 66.5
Nature 64.5
Overcoat 63.5
Coat 63.5
Sun Hat 62.1
Photography 61.4
Photo 61.4
Portrait 61.4
Cap 60
Sleeve 59.3
Pants 57.5
Shorts 55.6
Machine 55.3
Spoke 55.3
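The tag rows above follow a simple `<Label> <score>` layout, with multi-word labels such as "Sun Hat". A minimal sketch of turning such rows into structured pairs (the `parse_tags` helper is illustrative, not part of any vendor API):

```python
# Parse machine-tag rows of the form "<Label> <score>" into
# (label, score) tuples. Labels may contain spaces, so the numeric
# score is split off from the right-hand end of each row.

def parse_tags(lines):
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank rows between sections
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

sample = [
    "Apparel 99.9",
    "Sun Hat 62.1",
    "Cap 60",
]
print(parse_tags(sample))
```

The same split-from-the-right approach works for the Imagga and Microsoft lists below, which use the identical layout.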

Imagga
created on 2022-03-11

device 28.9
elevator 24.5
old 20.9
lifting device 19.6
people 19
fashion 18.1
call 16.2
person 15.8
adult 15.6
attractive 15.4
black 15
man 14.8
architecture 14.1
wall 13.7
portrait 12.9
sexy 12.8
window 12.8
posing 12.4
lady 12.2
pretty 11.9
male 11.3
one 11.2
house 10.9
city 10.8
building 10.8
telephone 10.8
interior 10.6
ancient 10.4
home 10.4
guillotine 10.1
pay-phone 10.1
happy 10
sensual 10
history 9.8
human 9.7
urban 9.6
face 9.2
dark 9.2
vintage 9.1
dress 9
locker 9
instrument of execution 8.9
hair 8.7
light 8.7
elegance 8.4
sensuality 8.2
style 8.2
machine 8
cute 7.9
brunette 7.8
door 7.8
grunge 7.7
casual 7.6
town 7.4
emotion 7.4
fastener 7.4
instrument 7.3
figure 7.3
alone 7.3
art 7.2
looking 7.2
body 7.2
room 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

clothing 96.7
black and white 96.4
person 95.1
text 93.8
man 90.1
street 89.6
monochrome 79
human face 77.8
posing 40.5

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 97.6%
Calm 99.9%
Disgusted 0%
Surprised 0%
Confused 0%
Sad 0%
Angry 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Calm 29.1%
Angry 22.2%
Disgusted 20.5%
Sad 14.3%
Confused 7.9%
Surprised 2.9%
Happy 2.1%
Fear 1%
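The emotion rows above are percentage scores over a fixed label set; note how decisive the first face's reading is (Calm 99.9%) compared with the second's mixed profile. A hedged sketch of reducing one face's rows to its dominant emotion (the `dominant_emotion` helper and the row format are assumptions based on this record, not a Rekognition API call):

```python
# Pick the highest-scoring emotion from rows shaped like "<Emotion> <pct>%".
def dominant_emotion(rows):
    best_label, best_score = None, -1.0
    for row in rows:
        label, pct = row.rsplit(" ", 1)
        score = float(pct.rstrip("%"))
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score

# Second face's scores as listed in this record.
face2 = ["Calm 29.1%", "Angry 22.2%", "Disgusted 20.5%", "Sad 14.3%",
         "Confused 7.9%", "Surprised 2.9%", "Happy 2.1%", "Fear 1%"]
print(dominant_emotion(face2))
```

For the second face the top score is barely ahead of the runners-up, which is worth surfacing before treating the label as a fact about the image.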

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a man and a woman posing for a picture 57.6%
a man and woman posing for a picture 51.2%
a person standing posing for the camera 51.1%

Text analysis

Amazon

TRANCE
авля
авля YT33AB
YT33AB