Human Generated Data

Title

Untitled (boy holding puppy)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17557

Machine Generated Data

Tags (each tag is followed by its confidence score, in percent)

Amazon
created on 2022-02-26

Person 98.1
Human 98.1
Smoke 94.3
Photography 64.3
Photo 64.3
Portrait 63.4
Face 63.4
Smoking 61.6
Finger 61.5
Suit 56.9
Clothing 56.9
Coat 56.9
Overcoat 56.9
Apparel 56.9
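
The Amazon labels above follow the shape of Amazon Rekognition DetectLabels output (a label name with a confidence score). A minimal Python sketch of such a call with boto3 follows; the image file name and the MinConfidence threshold are illustrative placeholders, not values taken from this record.

import boto3

# Illustrative only: the image path and MinConfidence are placeholders.
rekognition = boto3.client("rekognition")

with open("untitled_boy_holding_puppy.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=50,
    )

for label in response["Labels"]:
    # Prints pairs such as "Person 98.1", matching the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")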

Clarifai
created on 2023-10-29

monochrome 99.6
people 99.6
portrait 99.5
man 97.2
one 95.9
adult 95.6
black and white 92.5
sit 92.5
sepia 91.5
vintage 87.1
street 87
chair 86.3
nostalgia 85.8
art 85.2
elderly 83.2
old 81
wedding 79.6
retro 79.1
writer 78.3
actor 77.5

Imagga
created on 2022-02-26

person 43
adult 33.9
people 32.4
man 28.9
lifestyle 26
senior 25.3
portrait 25.2
male 24.3
mature 22.3
happy 20.1
old 19.5
grandma 19.4
face 19.2
human 18
looking 17.6
hair 17.4
outdoors 17.2
scholar 16.1
casual 15.3
black 15
one 14.9
grandfather 14.7
men 14.6
smiling 14.5
elderly 14.4
pretty 14
smile 13.5
gray 13.5
retirement 13.4
outdoor 13
lady 13
intellectual 12.9
attractive 12.6
handsome 12.5
professional 12.3
business 12.2
sitting 12
outside 12
indoor 11.9
older 11.7
city 11.6
retired 11.6
holding 11.6
age 11.4
life 11.4
home 11.2
love 11.1
street 11
day 11
happiness 11
alone 11
head 10.9
hand 10.6
look 10.5
couple 10.5
women 10.3
expression 10.2
blond 10.2
phone 10.1
communication 10.1
businesswoman 10
modern 9.8
cheerful 9.8
technology 9.7
together 9.6
model 9.3
two 9.3
park 9.1
aged 9.1
office 8.9
worker 8.9
businessman 8.8
hairdresser 8.8
urban 8.7
cute 8.6
corporate 8.6
active 8.6
enjoying 8.5
world 8.5
joy 8.4
glasses 8.3
leisure 8.3
telephone 8.3
indoors 7.9
vertical 7.9
work 7.9
60s 7.8
eyes 7.8
1 7.7
suit 7.7
using 7.7
mother 7.7
employee 7.7
clothing 7.6
only 7.6
females 7.6
mobile 7.5
camera 7.4
executive 7.4
computer 7.3
20s 7.3
girls 7.3
confident 7.3
child 7.2
pensioner 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.1
person 97.4
man 92.1
human face 91.9
clothing 79.3
black and white 74.5
portrait 52.4
crowd 0.9
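
The Microsoft tags resemble output from Azure Computer Vision image analysis. The sketch below assumes the v3.2 Analyze REST endpoint; the resource endpoint, subscription key, and image URL are placeholders rather than values from this record.

import requests

# Assumed Azure Computer Vision v3.2 Analyze call; endpoint, key, and
# image URL are placeholders.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/image.jpg"},
    timeout=30,
)
for tag in resp.json()["tags"]:
    # Confidence is reported on a 0-1 scale; scale to percent to match the list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")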

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 2-10
Gender Female, 80%
Calm 98%
Sad 0.8%
Confused 0.4%
Surprised 0.4%
Angry 0.2%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%
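
The age range, gender, and emotion scores above are the kind of attributes Amazon Rekognition DetectFaces returns when all facial attributes are requested. A minimal boto3 sketch follows; the image file name is a placeholder.

import boto3

# Illustrative only: the image path is a placeholder.
rekognition = boto3.client("rekognition")

with open("untitled_boy_holding_puppy.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]       # e.g. {"Low": 2, "High": 10}
    gender = face["Gender"]      # e.g. {"Value": "Female", "Confidence": 80.0}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        # e.g. "Calm 98.0%"
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")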

Feature analysis

Amazon

Person
Person 98.1%

Captions

Text analysis

Amazon

sel
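
The detected string above is consistent with Amazon Rekognition DetectText output, which reports words and lines found in an image along with confidence scores. A minimal boto3 sketch follows; the image file name is a placeholder.

import boto3

# Illustrative only: the image path is a placeholder.
rekognition = boto3.client("rekognition")

with open("untitled_boy_holding_puppy.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

for detection in response["TextDetections"]:
    # Each detection is a WORD or LINE with its detected text and confidence.
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")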