Human Generated Data

Title

Untitled (Headdress Ball: man and women with very elaborate headdresses)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5658

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.7
Human 99.7
Person 99.4
Person 99.1
Person 98
Person 97.2
Clothing 90.7
Apparel 90.7
Person 89.4
Face 75.5
People 71.7
Drawing 68.9
Art 68.9
Crowd 60.8
Poster 60
Advertisement 60
Sketch 57.5

Clarifai
created on 2023-10-15

people 99.8
group 98.9
adult 98.7
man 97.8
monochrome 93.7
illustration 91.8
music 91.4
vehicle 90.4
musician 89.1
desktop 88.6
many 88.3
woman 86.5
retro 86.1
vintage 85.4
leader 84.3
nostalgia 84.2
actor 83.7
art 83.1
portrait 81.4
print 80.8

Imagga
created on 2021-12-15

brass 57.3
wind instrument 52
cornet 47.1
sax 44.2
musical instrument 29.3
negative 24.1
film 22.6
male 19.8
people 19.5
businessman 18.5
person 17.7
trombone 16.8
man 16.8
drawing 16.6
business 15.8
black 15.6
silhouette 14.9
work 13.3
photographic paper 13
manager 12.1
design 11.8
professional 11.8
adult 11
chart 10.5
art 10.5
symbol 10.1
old 9.7
portrait 9.7
group 9.7
technology 9.6
engineer 9.6
looking 9.6
serious 9.5
plan 9.4
men 9.4
grunge 9.4
human 9
sky 8.9
job 8.8
photographic equipment 8.7
pencil 8.6
hand 8.3
new 8.1
office 8
building 7.9
designing 7.9
pensive 7.8
space 7.8
construction 7.7
engineering 7.6
horn 7.6
elegance 7.6
style 7.4
device 7.4
event 7.4
graphic 7.3
team 7.2
women 7.1
day 7.1
architecture 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.8
posing 95.5
old 95.5
drawing 94.1
window 93
sketch 92.2
cartoon 89.5
man 77.5
clothing 68.7
person 67.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-44
Gender Male, 71.2%
Surprised 54%
Calm 11.8%
Angry 11.1%
Sad 9.8%
Disgusted 6.6%
Happy 2.5%
Confused 2.3%
Fear 1.8%

AWS Rekognition

Age 19-31
Gender Female, 91.4%
Angry 59%
Happy 19.1%
Surprised 14.5%
Fear 3.4%
Calm 1.7%
Confused 1.7%
Sad 0.3%
Disgusted 0.3%

AWS Rekognition

Age 22-34
Gender Female, 95%
Surprised 47.6%
Calm 24.7%
Confused 9.1%
Angry 8.5%
Happy 7.4%
Sad 1.5%
Disgusted 0.8%
Fear 0.4%

AWS Rekognition

Age 18-30
Gender Female, 94.5%
Calm 79.8%
Confused 5.6%
Sad 4%
Angry 3.9%
Surprised 3.9%
Happy 1.5%
Fear 0.8%
Disgusted 0.5%

AWS Rekognition

Age 26-40
Gender Female, 73%
Happy 83.9%
Calm 10.9%
Sad 3.2%
Angry 0.8%
Fear 0.5%
Confused 0.3%
Surprised 0.2%
Disgusted 0.2%

Feature analysis

Amazon

Person 99.7%
Poster 60%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

14599
14599.
ИАЧЯЗЯЦ
28399
28399 ИАЧЯЗЯЦ ALL
ALL

Google

14599. 14599
14599.
14599