Human Generated Data

Title

Untitled (baby on changing table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17858

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2022-02-26

Furniture 99.9
Human 90.4
Person 86.2
Bed 86
Funeral 82.4
Cradle 71
Couch 70.3
Crib 68.1
Room 62.6
Indoors 62.6
Chair 57.8
Flower 55.1
Plant 55.1
Blossom 55.1
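
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch in Python with boto3, assuming the photograph is available as a local file; the region, file name, and thresholds are placeholders:

    import boto3

    # Placeholder region and file name; MinConfidence roughly mirrors the lowest score shown above.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=55,
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')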

Clarifai
created on 2023-10-29

monochrome 99.4
sleep 98.1
people 97.7
bed 96.3
furniture 91.8
wear 91.2
room 90.1
street 88.6
hospital 87.9
adult 87.2
family 86
reclining 85.7
woman 84.9
man 83.9
light 83.3
girl 82.8
indoors 82
nude 81.3
one 80.9
administration 80.3
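
The Clarifai tags are concept/confidence pairs from a general image-recognition model. A hedged sketch against Clarifai's v2 REST predict endpoint; the model id, access token, image URL, and exact payload/response shape are all assumptions:

    import requests

    # Placeholders/assumptions: model id, personal access token, image URL,
    # and the v2 request/response structure.
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    headers = {"Authorization": "Key YOUR_PAT", "Content-Type": "application/json"}
    payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}

    r = requests.post(url, headers=headers, json=payload)
    r.raise_for_status()
    for concept in r.json()["outputs"][0]["data"]["concepts"]:
        # Concept values are reported in 0-1; scale to match the percentages listed above.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')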

Imagga
created on 2022-02-26

white goods 47.2
dishwasher 43.6
home appliance 42.1
appliance 35.1
device 27.2
car 25.3
equipment 22.8
transportation 18.8
vehicle 18.8
iron lung 18.4
auto 18.2
technology 17.8
automobile 16.3
drive 16.1
washer 16
respirator 14.7
transport 14.6
durables 14.5
old 13.9
speed 13.7
power 13.4
travel 13.4
close 12
apparatus 11.9
black 11.4
breathing device 11
headlight 10.8
metal 10.4
incubator 10.3
object 10.2
plastic 10.1
vessel 9.9
business 9.7
detail 9.6
wheel 9.4
seat 9.4
fast 9.3
graphics 9.1
modern 9.1
design 9
interior 8.8
home 8.8
bag 8.7
storage 8.6
media 8.6
industry 8.5
3d 8.5
side 8.4
three dimensional 8.4
retro 8.2
music 8.1
digital 8.1
light 8
box 7.9
machine 7.8
render 7.8
stove 7.8
play 7.7
electric 7.6
effects 7.6
electronics 7.6
chrome 7.5
contemporary 7.5
vintage 7.4
classic 7.4
closeup 7.4
road 7.2
computer 7.2
bathtub 7.1
silver 7.1
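
The Imagga tags could be reproduced with Imagga's v2 tagging endpoint. A sketch assuming HTTP basic auth with a placeholder key/secret and a hosted copy of the image; the endpoint and response shape are assumptions:

    import requests

    # Placeholder credentials and image URL; endpoint and response shape are
    # assumptions based on Imagga's v2 tagging API.
    r = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    r.raise_for_status()
    for tag in r.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')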

Microsoft
created on 2022-02-26

indoor 94.5
black and white 91.9
text 70.2
clothes 18.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 0-6
Gender Female, 83.4%
Calm 72.4%
Sad 21.5%
Surprised 2%
Happy 1.4%
Fear 1.3%
Disgusted 0.6%
Angry 0.5%
Confused 0.3%
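
The age range, gender, and per-emotion percentages above match the FaceDetails structure returned by AWS Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming a local copy of the photograph and a placeholder region:

    import boto3

    # Placeholder region and file name; Attributes=["ALL"] requests age range,
    # gender, and emotion estimates like those listed above.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')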

Feature analysis

Amazon

Person
Person 86.2%

Categories

Captions

Microsoft
created on 2022-02-26

a person sitting on a bed 27%
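
The Microsoft tags and this caption resemble output from Azure Computer Vision's analyze operation with the Tags and Description visual features. A sketch against the v3.2 REST endpoint; the resource endpoint, key, image file, and response field names are assumptions:

    import requests

    # Placeholders/assumptions: the Azure resource endpoint, key, image file,
    # and the v3.2 analyze response field names.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    headers = {
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",
        "Content-Type": "application/octet-stream",
    }
    params = {"visualFeatures": "Tags,Description"}

    with open("photo.jpg", "rb") as f:
        r = requests.post(f"{endpoint}/vision/v3.2/analyze",
                          headers=headers, params=params, data=f.read())
    r.raise_for_status()
    analysis = r.json()

    for tag in analysis["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
    for caption in analysis["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.0f}%')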

Text analysis

Amazon

VOI
KODOKSVLELA

Google

YT37A2- AGO
YT37A2-
AGO
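
The detected strings above are machine-read text, likely film-edge markings on the negative. A sketch of how the Amazon results could be produced with AWS Rekognition's DetectText operation (Google Cloud Vision's text detection is analogous); the region and image path are placeholders:

    import boto3

    # Placeholder region and file name.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE-level detections correspond to the strings listed above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')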