Human Generated Data

Title

Phil Baker (1898-1963) [for Condé Nast]

Date

c. 1930

People

Artist: Edward Steichen, American, 1879-1973

Sitter: Phil Baker, 1898-1963

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.14

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 98.8
Human 98.8
Person 95.7
Musical Instrument 88.1
Accordion 88.1

Clarifai
created on 2023-10-15

music 99.5
concert 99.3
performance 99.2
people 99.1
stage 98.5
musician 98.4
portrait 97.5
adult 96.9
one 96.8
monochrome 95
woman 94.5
singer 94.3
still life 92.2
man 91.3
theatre 91.2
sculpture 91
stadium 90.8
side view 90.7
shadow 90.6
band 90.3

Imagga
created on 2021-12-15

accordion 100
keyboard instrument 100
wind instrument 100
musical instrument 100
concertina 36.8
free-reed instrument 29.5
man 26.2
person 22.4
male 21.3
adult 20.7
people 16.7
black 14.4
studio 13.7
business 13.4
model 13.2
holding 13.2
portrait 12.9
face 12.8
attractive 12.6
hair 11.9
music 11.7
fashion 11.3
hand 10.6
businessman 10.6
lady 10.5
musical 10.5
clothing 9.8
pretty 9.8
human 9.7
color 9.4
men 9.4
smiling 9.4
happy 9.4
instrument 9.3
one 9
musician 8.8
women 8.7
play 8.6
expression 8.5
old 8.4
retro 8.2
style 8.2
sexy 8
lifestyle 7.9
happiness 7.8
art 7.8
band 7.8
money 7.7
costume 7.6
casual 7.6
dark 7.5
sound 7.5
entertainment 7.4
light 7.3
20s 7.3
make 7.3
success 7.2
suit 7.2
handsome 7.1
device 7.1
paper 7.1
pipe 7

Microsoft
created on 2021-12-15

Face analysis

AWS Rekognition

Age 7-17
Gender Male, 93.2%
Calm 90.7%
Sad 4.1%
Surprised 2.4%
Confused 0.9%
Angry 0.7%
Fear 0.5%
Happy 0.5%
Disgusted 0.3%

AWS Rekognition

Age 32-48
Gender Male, 99.4%
Sad 49.1%
Confused 17.1%
Calm 14.6%
Fear 8.7%
Surprised 3.7%
Angry 3%
Disgusted 2.7%
Happy 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Text analysis

Amazon

1075-2
7075.2 U.F.

Google

7075.2 U.F 1075-2
7075.2
U.F
1075-2