Human Generated Data

Title

Untitled (window reflection)

Date

1930

People

Artist: Eliot Elisofon, American, 1911-1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.27

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Home Decor 98.5
Human 98.1
Person 98.1
Meal 73.6
Food 73.6
Window 67.2
Apparel 65
Clothing 65
Spoke 61.5
Machine 61.5
Restaurant 61
Female 60.1
Outdoors 59.3
Handrail 56.4
Banister 56.4
Cafe 55.4

Clarifai
created on 2023-10-15

people 99.9
monochrome 97.9
two 97.3
one 95.7
administration 95.2
street 95
adult 95
man 94.8
wear 94.8
portrait 93.7
child 93.2
group 92
fence 91.5
many 91
group together 88.8
three 88.3
woman 85.3
leader 83.1
commerce 82.1
home 82.1

Imagga
created on 2021-12-14

newspaper 77.5
product 60.8
creation 46.9
city 27.4
building 25.1
structure 24.9
architecture 21.6
fountain 19.9
old 19.5
vintage 16.5
grunge 15.3
antique 14.7
travel 14.1
new 12.1
ancient 12.1
urban 11.4
water 11.3
people 11.2
historic 11
house 11
paper 11
sky 10.8
landmark 10.8
man 10.7
retro 10.6
sculpture 10.5
skyline 10.4
street 10.1
light 10
aged 10
negative 9.9
tourism 9.9
tower 9.8
art 9.8
shopping cart 9.5
fence 9.3
tree 9.2
film 9.2
dirty 9
text 8.7
statue 8.7
window 8.6
brick 8.2
history 8
palace 7.9
stone 7.8
black 7.8
construction 7.7
handcart 7.6
texture 7.6
person 7.6
grungy 7.6
pattern 7.5
outdoors 7.5
ornate 7.3
skyscraper 7.3
paint 7.2
river 7.1
wall 7.1
male 7.1
summer 7.1
scenic 7

Microsoft
created on 2021-12-14

person 78.9
black and white 73.2
white 69.5
clothing 66.1
street 64
text 59.3
monochrome 53.1
cage 38.6

Face analysis

AWS Rekognition

Age 28-44
Gender Male, 63.5%
Calm 94.8%
Sad 3%
Surprised 1.3%
Happy 0.4%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%

Captions

Microsoft
created on 2021-12-14

a person sitting in a cage 63.8%
a cat sitting on top of a cage 34.7%
a close up of a cage 34.6%

Text analysis

Google

DO00000
DO00000