Human Generated Data

Title

Bernard Berenson (1865-1959)

Date

c. 1903

People

Artist: Sarah Choate Sears, American, 1858-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Montgomery S. Bradley and Cameron Bradley, P1984.57

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Face 100
Human 100
Beard 97.1
Suit 87.6
Clothing 87.6
Apparel 87.6
Overcoat 87.6
Coat 87.6
Person 81.5
Man 75.6
Portrait 64.1
Photography 64.1
Photo 64.1
Painting 57.2
Art 57.2
Crowd 56

Clarifai
created on 2023-10-26

portrait 99.8
people 99.7
monochrome 98.9
one 98.4
man 98.1
music 97.3
adult 96.1
art 95.3
musician 94.3
light 92.7
telephone 92.6
profile 92.3
window 91.8
baby 91.8
girl 91.6
smoke 90.3
vintage 90
model 89.5
self 88.6
book series 88.1

Imagga
created on 2022-01-23

portrait 27.8
person 25.3
face 24.1
negative 23
man 22.8
adult 21.3
male 21.3
device 19.9
looking 19.2
film 18.8
head 18.5
hair 18.2
black 18
people 17.8
human 17.2
eyes 17.2
expression 17.1
attractive 15.4
old 15.3
senior 15
photographic paper 13.8
happy 13.8
close 13.7
computer 13.4
laptop 13.3
look 13.1
one 12.7
model 12.4
fashion 12.1
sexy 12
sad 11.6
lady 11.4
pretty 11.2
smiling 10.8
holding 10.7
smile 10.7
brunette 10.5
work 10.2
cute 10
hand 9.9
working 9.7
elderly 9.6
home 9.6
serious 9.5
mouth 9.4
photographic equipment 9.3
dark 9.2
blackboard 9.2
retired 8.7
boy 8.7
hands 8.7
business 8.5
emotion 8.3
blond 8.2
style 8.2
office 8
lifestyle 7.9
behind 7.8
technology 7.4
lips 7.4
paper 7.3
pose 7.2
aged 7.2
eye 7.1
handsome 7.1
mustache 7.1
call 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

human face 98.8
text 98.1
person 94.9
man 85.7
portrait 83.1
black and white 75.9
black 70.2
white 65.9

Color Analysis

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 100%
Calm 99.6%
Sad 0.2%
Confused 0.1%
Surprised 0%
Angry 0%
Fear 0%
Disgusted 0%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 81.5%
Painting 57.2%

Categories

Imagga

paintings art 99.6%

Captions

Microsoft
created on 2022-01-23

an old photo of a man 65.8%
old photo of a man 61.3%
a man in a suit and tie 61.2%