Human Generated Data

Title

Untitled (woman posing with street sign, Hollywood, California)

Date

1933, printed later

People

Artist: Delmar Watson, American, 1926–2008

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1064

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 96.9
Person 96.9
Car 90.9
Transportation 90.9
Vehicle 90.9
Automobile 90.9
Art 87.6
Sculpture 85.5
Home Decor 78.7
Statue 75.4
Machine 65.5
Wheel 65.5
Plant 59.6
Building 56.9
Pillar 56.9
Column 56.9
Architecture 56.9

Clarifai
created on 2023-10-25

people 99.8
one 99.5
portrait 99.1
woman 97.3
monochrome 97.2
street 95.7
art 95.6
adult 94.7
man 93.9
two 93.1
child 90
retro 88.6
leader 87.9
bill 87.6
wear 83.7
facial expression 83.7
music 82.5
girl 82.4
vintage 81.9
statue 81

Imagga
created on 2022-01-08

architecture 27
building 26.3
sketch 24.8
chime 21.6
drawing 20.6
old 20.2
structure 18.9
window 17.9
percussion instrument 17.4
city 16.6
house 15.9
representation 14.8
sky 13.4
musical instrument 13.1
wall 12.8
travel 12.7
vintage 12.4
tourism 12.4
street 12
ancient 11.2
town 11.1
device 11.1
religion 10.7
light 10.7
door 10.7
sculpture 10.6
stone 10.6
lamp 10.4
antique 10.4
urban 9.6
historical 9.4
culture 9.4
church 9.2
historic 9.2
frame 9.2
memorial 9.1
retro 9
history 8.9
entrance 8.7
grunge 8.5
gravestone 8.4
detail 8
lantern 7.8
snow 7.7
art 7.5
exterior 7.4
water 7.3
bell 7.3
black 7.2
home 7.2
tower 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 96.6
black and white 86.4
art 84.7

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 99.8%
Calm 80.9%
Happy 15.4%
Confused 1.7%
Disgusted 0.9%
Sad 0.6%
Angry 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Female, 100%
Surprised 86.3%
Calm 6.5%
Happy 2.4%
Fear 1.3%
Sad 1.1%
Angry 0.8%
Disgusted 0.8%
Confused 0.7%

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.9%
Car 90.9%
Wheel 65.5%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2022-01-08

an old photo of a person 68.5%
old photo of a person 64.8%
a vintage photo of a person 60.3%

Text analysis

Amazon

Claus

Google

Nefice of Local lu laus
Local
lu
laus
Nefice
of