Human Generated Data

Title

Untitled (man on stage speaking from podium in front of "Ford" banner)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American (1905 - 1985)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4666

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 94.7
Indoors 92.7
Interior Design 92.7
Person 78.6
Person 77.9
Person 77.4
Person 76.8
Person 75.3
Person 75.3
Person 74.9
People 68.2
Person 67.2
Cream 66.1
Icing 66.1
Food 66.1
Dessert 66.1
Cake 66.1
Creme 66.1
Person 65.1
Face 61.8
Person 59
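
The Amazon tags above read like output from the AWS Rekognition label-detection API, which returns label names with confidence scores on a 0-100 scale. A minimal sketch of such a call with boto3 follows; the bucket name, object key, and thresholds are illustrative assumptions, not values taken from this record.

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        # Illustrative S3 location for a scan of the photograph.
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-4666.jpg"}},
        MaxLabels=25,
        MinConfidence=55,
    )

    # Prints pairs such as "Human 94.7", mirroring the tag/score list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))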

Clarifai
created on 2023-10-15

people 99.4
wear 95.4
monochrome 95.3
group 95
man 94.6
adult 94.2
music 93.7
ceremony 91.3
outfit 90
wedding 88.4
no person 88.1
retro 86.6
many 86.3
several 85.3
woman 84.9
religion 82
furniture 81.9
leader 81.7
cemetery 81.7
actress 80.7

Imagga
created on 2021-12-14

sketch 56.8
drawing 52.2
representation 33.2
art 25.2
cartoon 21.4
architecture 20.4
design 18
silhouette 16.6
graphic 15.3
house 15.2
city 15.1
old 13.9
man 12.8
building 12.6
history 11.6
vintage 11.6
urban 11.4
floral 11.1
retro 10.6
color 10.6
ancient 10.4
clip art 10.2
structure 10.1
frame 10
business 9.7
pattern 9.6
icon 9.5
leaf 9.3
modern 9.1
tower 9
shape 8.9
water 8.7
scene 8.7
flower 8.5
black 8.4
town 8.3
element 8.3
balcony 8.1
antique 8.1
boutique 8
construction 7.7
texture 7.6
finance 7.6
decoration 7.6
plant 7.5
home 7.5
symbol 7.4
holiday 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 97.9
indoor 89.6
person 75.8
concert 63.5
clothing 59.9
cartoon 57.3
old 48.3
clothes 23.3
cluttered 20.9
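
The Microsoft tags (and the captions listed further below) are consistent with the Azure Computer Vision Analyze Image operation. A hedged sketch against the v3.2 REST endpoint is shown here; the endpoint host, key, and image URL are placeholders, and the service reports confidences on a 0-1 scale, so they are multiplied by 100 to match the figures above.

    import requests

    # Placeholder resource endpoint, key, and image URL -- assumptions for illustration only.
    endpoint = "https://example-resource.cognitiveservices.azure.com"
    analyze_url = endpoint + "/vision/v3.2/analyze"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY", "Content-Type": "application/json"}
    params = {"visualFeatures": "Tags,Description"}
    body = {"url": "https://example.org/steinmetz-4666.jpg"}

    result = requests.post(analyze_url, headers=headers, params=params, json=body).json()

    # Tags such as "text 97.9" and "indoor 89.6".
    for tag in result["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))

    # Captions such as "an old photo of a person 64.3" (see the Captions section below).
    for caption in result["description"]["captions"]:
        print(caption["text"], round(caption["confidence"] * 100, 1))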

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 37-55
Gender Male, 55.4%
Calm 97.5%
Happy 0.8%
Sad 0.7%
Surprised 0.6%
Confused 0.2%
Angry 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 23-37
Gender Female, 52.5%
Calm 72.8%
Surprised 12.9%
Sad 7.6%
Confused 3.9%
Fear 1.6%
Angry 0.5%
Happy 0.5%
Disgusted 0.2%

AWS Rekognition

Age 16-28
Gender Female, 79.2%
Angry 41.3%
Calm 31.6%
Sad 11.6%
Happy 9.2%
Confused 2%
Fear 2%
Surprised 1.5%
Disgusted 0.9%

AWS Rekognition

Age 48-66
Gender Male, 92.7%
Calm 90.9%
Surprised 3%
Angry 1.5%
Confused 1.3%
Sad 1.2%
Fear 0.9%
Happy 0.7%
Disgusted 0.5%
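
Each face block above matches the shape of an AWS Rekognition DetectFaces response, which reports an estimated age range, a gender guess with its confidence, and a per-emotion confidence score. A minimal sketch with boto3, again using an illustrative image location:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-4666.jpg"}},
        Attributes=["ALL"],  # needed to include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions arrive in arbitrary order; sort by confidence as in the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")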

Feature analysis

Amazon

Person 78.6%

Categories

Imagga

interior objects 84.6%
paintings art 12.3%
food drinks 2.4%

Captions

Microsoft
created on 2021-12-14

an old photo of a person 64.3%
old photo of a person 63.4%
a group of people in a room 63.3%

Text analysis

Amazon

BLUE
20867.
WFIL
NAGON
L9807
KAMT
Bird
جمعد
any جمعد
any
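
The detected strings above (including the garbled non-Latin fragments) have the form of AWS Rekognition DetectText output, which returns LINE and WORD detections containing whatever characters the OCR recognizes. A minimal sketch:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-4666.jpg"}}
    )

    # WORD detections yield strings such as "WFIL" and "20867.";
    # noisy regions can come back as garbled tokens like those listed above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], round(detection["Confidence"], 1))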

Google

L9802 WFIL 20867.
L9802
WFIL
20867.
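
The Google results follow the pattern of Cloud Vision text detection, where the first annotation is the full detected block ("L9802 WFIL 20867.") and the remaining annotations are the individual tokens. A minimal sketch with the google-cloud-vision client; the local file path is an assumption.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the scanned photograph.
    with open("steinmetz-4666.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # First entry is the full text block, followed by the individual tokens.
    for annotation in response.text_annotations:
        print(annotation.description)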