Human Generated Data

Title

Untitled (woman feeding baby in high chair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17350

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.5
Human 98.5
Person 97.1
Furniture 92.7
Chair 88.1
Face 87.8
Clothing 74.2
Apparel 74.2
Outdoors 72.6
Portrait 71.9
Photography 71.9
Photo 71.9
Nature 71.3
Meal 65.5
Food 65.5
Table 61.9
Dish 61.2
Dining Table 60.8
Baby 60
Person 59.1
Female 55.3
Sitting 55.1

Clarifai
created on 2023-10-29

people 99.8
child 98.8
two 96.3
monochrome 95.1
adult 94.6
man 94.1
woman 92.3
group 90.9
recreation 89.8
wear 88.4
group together 87.2
boy 86.8
son 86.7
family 85.7
three 85.6
vehicle 85.5
actor 84.4
baby 82.9
nostalgia 82.8
interaction 82

Imagga
created on 2022-02-26

man 34.2
brass 34
cornet 30.2
wind instrument 27.8
male 27.7
chair 25.7
barber chair 25.6
musical instrument 23.3
person 21.7
people 20.6
seat 18.1
men 18
worker 16.9
hairdresser 16.8
device 15.2
adult 14.4
outdoors 14.2
work 13.4
working 13.2
equipment 12.9
couple 12.2
furniture 11.8
human 11.2
old 11.1
happy 10.6
hand 10.6
youth 10.2
occupation 10.1
playing 10
house 10
repair 9.6
building 9.5
day 9.4
construction 9.4
industry 9.4
senior 9.4
focus 9.3
black 9
job 8.8
home 8.8
horn 8.7
lifestyle 8.7
skill 8.7
music 8.5
leisure 8.3
industrial 8.2
machine 8
steel 8
smiling 7.9
together 7.9
sitting 7.7
outside 7.7
two 7.6
holding 7.4
safety 7.4
lady 7.3
teenager 7.3
instrument 7.3
metal 7.2
portrait 7.1
professional 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.8
person 98.7
black and white 77.8
human face 77.6
clothing 76.2
drawing 75.5
old 63.4
cooking 20.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 40-48
Gender Male, 76.5%
Calm 99.7%
Surprised 0.1%
Sad 0.1%
Disgusted 0%
Confused 0%
Happy 0%
Fear 0%
Angry 0%

Feature analysis

Amazon

Person
Person 98.5%
Person 97.1%
Person 59.1%

Categories