Human Generated Data

Title

Untitled (man and woman sitting in chairs outside)

Date

1934

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21920

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.6
Human 99.6
Clothing 77.6
Apparel 77.6
Coat 69.8
Chair 68.6
Furniture 68.6
Table 64.1
Building 57.9
Text 56.1

Clarifai
created on 2023-10-22

people 98.7
monochrome 97.0
wear 96.6
street 92.8
outfit 86.4
adult 85.9
two 85.6
man 85.2
indoors 84.2
commerce 82.5
group 82.3
furniture 82.1
art 81.7
woman 79.8
retro 79.7
window 79.6
group together 79.5
design 77.4
music 76.8
fashion 75.9

Imagga
created on 2022-03-11

device 23.3
building 15.1
equipment 13.7
industry 11.9
ventilator 11.7
business 11.5
metal 11.3
sky 10.8
work 10.7
steel 10.7
modern 10.5
architecture 10.2
man 10.1
technology 9.6
construction 9.4
power 9.2
city 9.1
people 8.9
travel 8.4
black 8.4
safety 8.3
worker 8.3
structure 8.2
chair 8.2
danger 8.2
industrial 8.2
new 8.1
design 7.9
male 7.8
scene 7.8
factory 7.7
instrument 7.7
house 7.5
prison 7.5
street 7.4
light 7.3
interior 7.1
working 7.1

Microsoft
created on 2022-03-11

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 52-60
Gender Male, 66.5%
Calm 99.9%
Surprised 0%
Fear 0%
Sad 0%
Disgusted 0%
Happy 0%
Confused 0%
Angry 0%

Feature analysis

Amazon

Person
Person 99.6%

Categories

Captions

Text analysis

Amazon

was
was ASCH3REC
ASCH3REC