Human Generated Data

Title

Untitled (wedding guests seated on stairs)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8595

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.1
Human 99.1
Person 95.1
Apparel 90.3
Clothing 90.3
Person 89.4
Accessories 86.8
Sunglasses 86.8
Accessory 86.8
Poster 83.4
Advertisement 83.4
Shoe 76.6
Footwear 76.6
Nature 66.5
Sunglasses 63.7
Outdoors 62
Collage 58.8
Ice 58.2
Shorts 56.3
Female 56.2
LCD Screen 55.9
Electronics 55.9
Screen 55.9
Monitor 55.9
Display 55.9
Person 43.2
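
The Amazon labels above have the shape of an AWS Rekognition DetectLabels response (label name plus confidence). A minimal sketch of how such tags could be produced with boto3, assuming a placeholder S3 location rather than the museum's actual storage:

import boto3

rekognition = boto3.client("rekognition")

# Placeholder bucket/key; the real image source for this record is not known here.
image = {"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz/4.2002.8595.jpg"}}

# DetectLabels returns label names with confidence scores (e.g. Person 99.1).
response = rekognition.detect_labels(Image=image, MinConfidence=40)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')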

Clarifai
created on 2023-10-25

people 99.5
monochrome 99
adult 96.9
man 96.4
street 95.3
music 93.8
wear 91.7
woman 91.2
group 88.9
art 88.6
group together 87
fashion 85.9
two 85.6
one 84.6
portrait 84.2
musician 81.8
recreation 81.7
model 81.7
singer 79.9
design 78.6
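
The Clarifai concepts likely come from its general prediction model. A rough sketch against the public v2 REST endpoint; the model ID, key, and image URL below are placeholders and may not match what was actually used for this record:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                   # placeholder credential
MODEL_ID = "general-image-recognition"              # assumed general model ID
IMAGE_URL = "https://example.org/4.2002.8595.jpg"   # placeholder image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept has a name and a 0-1 confidence (e.g. people 0.995).
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')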

Imagga
created on 2022-01-09

locker 42.8
fastener 34.5
restraint 25.9
device 22.4
person 21.8
black 17.4
man 14.1
people 13.9
portrait 12.9
sexy 12.8
house 11.7
adult 11.7
city 11.6
blackboard 11.4
fashion 11.3
home 11.2
male 10.6
glass 10.6
modern 10.5
urban 10.5
wall 10.3
shop 10.2
lifestyle 10.1
clothing 10
window 9.8
standing 9.6
happy 9.4
water 9.3
model 9.3
style 8.9
worker 8.9
indoors 8.8
business 8.5
door 8.4
shower 8.2
equipment 8
interior 8
work 7.8
men 7.7
one 7.5
design 7.3
lady 7.3
building 7.2
looking 7.2
kitchen 7.1
women 7.1
life 7
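
The Imagga tags match the output of its /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. A minimal sketch with placeholder credentials and image URL:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"                     # placeholder credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"               # placeholder credential
IMAGE_URL = "https://example.org/4.2002.8595.jpg"   # placeholder image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a confidence score (e.g. locker 42.8).
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')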

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.3
drawing 89.6
black and white 88.6
clothing 73.8
cartoon 73.3
person 68.3
sketch 54.5
poster 52.7
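
The Microsoft tags, and the caption in the Captions section below, are the kind of output the Azure Computer Vision Analyze Image operation returns when the Tags and Description features are requested. A minimal sketch, assuming a placeholder Azure resource, key, and image URL:

import requests

ENDPOINT = "https://example-resource.cognitiveservices.azure.com"  # placeholder resource
KEY = "YOUR_AZURE_CV_KEY"                                          # placeholder credential
IMAGE_URL = "https://example.org/4.2002.8595.jpg"                  # placeholder image URL

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()
analysis = response.json()

# Tags carry 0-1 confidences; captions include a confidence like the 31.4% below.
for tag in analysis["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for caption in analysis["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')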

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Calm 99.2%
Sad 0.4%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 39-47
Gender Female, 63.1%
Calm 84.8%
Happy 10%
Confused 1.7%
Surprised 1.2%
Sad 1%
Angry 0.7%
Disgusted 0.5%
Fear 0.2%
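
Both AWS Rekognition face records above (age range, gender, and an emotion distribution) have the shape of a DetectFaces response when all attributes are requested. A minimal sketch with the same placeholder image location as above:

import boto3

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face detail.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz/4.2002.8595.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')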

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
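
The Google Vision rows report likelihood buckets (Very unlikely through Very likely), matching the face detection annotations returned by the Cloud Vision Python client. A minimal sketch, again with a placeholder image URI:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

image = vision.Image()
image.source.image_uri = "https://example.org/4.2002.8595.jpg"  # placeholder image URI

response = client.face_detection(image=image)

# Each face annotation exposes the same likelihood fields listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)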

Feature analysis

Amazon

Person 99.1%
Sunglasses 86.8%
Poster 83.4%
Shoe 76.6%

Categories

Captions

Microsoft
created on 2022-01-09

a man doing a trick on a skateboard 31.4%

Text analysis

Amazon

11
17767.
.99LLI
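
The Amazon fragments are typical Rekognition DetectText output on a scanned print, where edge numbers and stray marks come back as short strings. A minimal sketch with the same placeholder image location as above:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz/4.2002.8595.jpg"}}
)

# LINE detections give whole strings; WORD detections break them up further.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')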

Google

-.
11
17766.
-. ר ררו 11 17766.
ר
ררו
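
The Google fragments, including the characters read as Hebrew letters, are the kind of result Cloud Vision OCR returns for handwritten edge numbers. A minimal sketch using the same placeholder URI as above:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

image = vision.Image()
image.source.image_uri = "https://example.org/4.2002.8595.jpg"  # placeholder image URI

# The first annotation is the full detected text; later entries are individual tokens.
response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)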