Human Generated Data

Title

Untitled (woman seated on garden bench)

Date

1935

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2008

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Bench 99.3
Furniture 99.3
Person 98.8
Human 98.8
Text 88.2
Drawing 76.7
Art 76.7
Newspaper 72.5
Sketch 63.9
Architecture 63.8
Building 63.8
Arch 59.7
Arched 59.7
Chair 57.4
Bus Stop 55.3

Clarifai
created on 2023-10-25

people 99.7
furniture 98.7
seat 97.9
one 96.2
man 95.8
bench 95.3
adult 95.2
chair 94.7
monochrome 93.5
sit 93.0
vintage 92.9
print 92.1
home 89.8
child 88.8
street 88.2
art 88.1
old 86.6
two 86.0
black and white 84.1
indoors 83.8

Imagga
created on 2021-12-14

building 50.7
structure 45.9
chair 41.3
greenhouse 40.4
seat 35.1
architecture 30.4
support 20.4
old 18.1
device 17.7
rocking chair 17.1
sky 15.9
travel 15.5
carousel 14.9
house 14.7
furniture 13.7
rest 13.7
urban 13.1
tourism 12.4
famous 12.1
ride 11.7
mechanical device 11.7
religion 11.6
city 11.6
history 11.6
water 11.3
ancient 11.2
construction 11.1
armrest 10.9
landmark 10.8
outdoor 10.7
palace 10.3
tree 10.0
fountain 9.8
outside 9.4
culture 9.4
stone 9.4
snow 9.2
traditional 9.1
landscape 8.9
forest 8.7
empty 8.6
park 8.5
historical 8.5
wood 8.3
church 8.3
style 8.2
mechanism 8.1
bench 8.1
trees 8.0
art 7.8
winter 7.7
sculpture 7.6
statue 7.6
monument 7.5
outdoors 7.5
iron 7.5
vintage 7.4
street 7.4
historic 7.3
metal 7.2
black 7.2
tool 7.1
river 7.1
temple 7.1
summer 7.1
sea 7.0

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.3
drawing 94.1
sketch 88.8
furniture 86.3
building 84.0
black and white 74.8
chair 70.5
old 62.0
stone 15.3

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 97.1%
Calm 87.8%
Happy 4.4%
Surprised 2.2%
Confused 2.0%
Sad 1.9%
Disgusted 1%
Fear 0.5%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Bench 99.3%
Person 98.8%

Categories