Human Generated Data

Title

Untitled (baby in chair next to stuffed bunny)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21676

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-03-05

Chair 99.8
Furniture 99.8
Clothing 98.1
Apparel 98.1
Person 86.3
Human 86.3
Shoe 81.7
Footwear 81.7
Floor 64.7
Portrait 62.9
Photography 62.9
Face 62.9
Photo 62.9
Stroller 57.8
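
These labels have the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced with boto3 follows; the image file name is a placeholder, and AWS credentials are assumed to be configured locally.

    import boto3

    # Rekognition client; region and credentials come from the local
    # AWS configuration.
    client = boto3.client("rekognition")

    # Read the digitized photograph from disk (placeholder path).
    with open("4.2002.21676.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns objects and concepts with confidence scores
    # on a 0-100 scale, matching the percentages listed above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')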

Clarifai
created on 2023-10-22

people 97.9
man 95.4
science 93.3
child 93.1
Halloween 93
no person 92.1
illustration 91.9
robot 91.5
retro 91
futuristic 90.6
wear 88.6
technology 87.8
nostalgia 87.4
one 85.7
adult 84.3
indoors 84
medicine 83.9
forthcoming 82.7
astronaut 82.6
monochrome 81.8
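
Clarifai exposes similar concept predictions through its v2 REST API. A hedged sketch using the public general-image-recognition model is below; the API key, file name, and exact payload shape are assumptions and may differ by account setup. Clarifai reports probabilities in [0, 1], scaled here to match the percentages above.

    import base64

    import requests

    # Placeholder credentials and image path.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

    with open("4.2002.21676.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    payload = {"inputs": [{"data": {"image": {"base64": image_b64}}}]}
    headers = {"Authorization": f"Key {API_KEY}"}

    response = requests.post(URL, json=payload, headers=headers).json()

    # Each predicted concept carries a probability in [0, 1].
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')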

Imagga
created on 2022-03-05

device 35.5
computer 21.4
technology 20
business 19.4
laptop 18.9
office 17.8
support 17.4
work 16.5
chair 16.2
gramophone 15.9
machine 15.2
equipment 14.3
loudspeaker 14.2
bookend 14.1
desk 12.5
people 12.3
keyboard 12.2
seat 12
person 11.9
notebook 11.8
working 11.5
record player 11.4
job 10.6
adult 9.7
object 9.5
kitchenware 9.5
smiling 9.4
communication 9.2
pen 8.6
3d 8.5
table 8.5
modern 8.4
retro 8.2
happy 8.1
metal 8
smile 7.8
scanner 7.8
corporate 7.7
professional 7.6
manager 7.4
vintage 7.4
style 7.4
suit 7.3
success 7.2
black 7.2
worker 7.1
silver 7.1
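
Imagga's /v2/tags endpoint produces tags with 0-100 confidences like those above. A minimal sketch follows; the credentials and image URL are placeholders, and the response layout reflects Imagga's documented result.tags structure.

    import requests

    # Placeholder credentials; Imagga uses HTTP Basic auth with an
    # API key and secret.
    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")
    image_url = "https://example.org/4.2002.21676.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=auth,
    ).json()

    for entry in response["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')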

Microsoft
created on 2022-03-05

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 86.4%
Surprised 75.1%
Calm 15.8%
Fear 6.2%
Happy 1.3%
Disgusted 0.7%
Angry 0.4%
Confused 0.2%
Sad 0.2%
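
These estimates have the shape of Rekognition's DetectFaces output with all facial attributes requested. A minimal boto3 sketch, again with a placeholder image path:

    import boto3

    client = boto3.client("rekognition")

    with open("4.2002.21676.jpg", "rb") as f:  # placeholder path
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion
    # estimates like those listed above.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')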

Feature analysis

Amazon

Person 86.3%
Shoe 81.7%

Categories

Imagga

interior objects 72%
paintings art 27.2%
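
These category names match Imagga's personal_photos categorizer. A hedged sketch against the /v2/categories endpoint; the credentials, image URL, and exact response layout are assumptions.

    import requests

    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholders
    image_url = "https://example.org/4.2002.21676.jpg"  # placeholder

    # Scores the image against the fixed category set of the
    # "personal_photos" categorizer.
    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": image_url},
        auth=auth,
    ).json()

    for entry in response["result"]["categories"]:
        print(f'{entry["name"]["en"]} {entry["confidence"]:.1f}%')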

Captions

Text analysis

Amazon

aos
JEANS
STARRA
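
Fragments like these are typical of Rekognition's DetectText OCR applied to print in the photograph. A minimal boto3 sketch (placeholder path):

    import boto3

    client = boto3.client("rekognition")

    with open("4.2002.21676.jpg", "rb") as f:  # placeholder path
        image_bytes = f.read()

    # DetectText returns LINE and WORD detections; the strings above
    # correspond to the DetectedText field.
    response = client.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])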

Google

YT3RA2- AGON
YT3RA2-
AGON
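
The pattern of a full string followed by its individual tokens is characteristic of Google Cloud Vision text detection, whose first annotation is the complete detected text and whose remaining annotations are the component words. A minimal sketch with the google-cloud-vision client; the file path is a placeholder and credentials are assumed to be configured via the environment.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.21676.jpg", "rb") as f:  # placeholder path
        content = f.read()

    response = client.text_detection(image=vision.Image(content=content))

    # text_annotations[0] is the full detected text; the rest are the
    # individual words, mirroring the listing above.
    for annotation in response.text_annotations:
        print(annotation.description)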