Human Generated Data

Title

Untitled (man holding baby)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16804

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-02-26

Clothing 99.3
Apparel 99.3
Person 99
Human 99
Plant 97.1
Tree 96
Chair 92.9
Furniture 92.9
Robe 72.5
Fashion 72.5
Female 69
Meal 68.8
Food 68.8
Person 68.6
Suit 68.5
Coat 68.5
Overcoat 68.5
Gown 68
Flower 65.3
Blossom 65.3
Face 63.4
Tie 60.2
Accessories 60.2
Accessory 60.2
Sleeve 59.6
Ornament 57.7
Wedding Gown 56.8
Wedding 56.8
Woman 56.6
Evening Dress 56.1
Flower Arrangement 55.6
Animal 55.5

Clarifai
created on 2023-10-29

people 99.4
child 98.1
monochrome 97.9
two 97.2
man 95.6
family 95.3
adult 95.1
woman 95.1
love 93
offspring 89.8
sit 85.6
baby 84.9
portrait 84.4
chair 84.3
wedding 79.2
affection 78.4
facial expression 77.6
couple 76.4
son 75.5
boy 74.9

Imagga
created on 2022-02-26

musical instrument 32
wind instrument 31
man 28.2
person 28.1
people 26.2
flute 24.2
woodwind 21.3
male 19.9
adult 17.9
laptop 16.5
business 16.4
businessman 15.9
stringed instrument 15.8
sport 14.8
men 13.7
computer 13.7
brass 13.3
violin 12.8
professional 12.3
sitting 12
black 12
happy 11.9
leisure 11.6
silhouette 11.6
park 11.5
player 11.5
looking 11.2
bowed stringed instrument 11
outdoors 10.7
symbol 10.1
sax 9.9
fun 9.7
working 9.7
technology 9.6
office 9.6
muscular 9.5
work 9.4
expression 9.4
water 9.3
athlete 9.2
competition 9.1
portrait 9.1
design 9
summer 9
handsome 8.9
happiness 8.6
worker 8.6
model 8.6
communication 8.4
lights 8.3
event 8.3
device 8.3
holding 8.3
human 8.2
one 8.2
music 8.1
clothing 8.1
cornet 8
couple 7.8
art 7.8
serve 7.8
tennis 7.8
court 7.8
play 7.8
skill 7.7
attractive 7.7
sky 7.6
casual 7.6
thinking 7.6
relaxation 7.5
vacations 7.5
training 7.4
flag 7.3
group 7.3
racket 7.2
smiling 7.2
wet 7.2
bright 7.1
job 7.1
day 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.5
window 86.6
snow 84.1
tree 83
person 75.4
clothing 71.2
christmas tree 62.3
black and white 59.2
old 48.2

Color Analysis

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.8%
Happy 41.6%
Surprised 25.9%
Calm 18.3%
Sad 7.6%
Fear 3.3%
Disgusted 1.3%
Confused 1%
Angry 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Person 68.6%
Tie 60.2%

Categories

Imagga

paintings art 95%
events parties 1.1%

Text analysis

Amazon

AZ