Human Generated Data

Title

Untitled (man and woman on beach with paddle and radio)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8937

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 97.3
Home Decor 83.5
Furniture 83
Oars 80.5
Chair 78.9
Clothing 73.5
Apparel 73.5
Water 70.3
Text 65.2
Outdoors 64.3
Face 63.7
Sitting 61.6
Photography 60.7
Photo 60.7
Advertisement 57.2
Paddle 56.5
Nature 55.8
Poster 55.5
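
The Amazon tags above are the kind of label-and-confidence pairs returned by AWS Rekognition's DetectLabels operation. A minimal sketch using boto3 follows; the local file name and region are assumptions for illustration, not part of the record.

import boto3

# Minimal sketch: label detection with AWS Rekognition via boto3.
# The file name and region below are placeholders, not part of the record.
client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_beach.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # Each label carries a name and a confidence in percent, matching the
    # "Person 99.6", "Oars 80.5" style of the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")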

Clarifai
created on 2023-10-25

people 99.7
adult 98.1
woman 97.7
wear 97.3
one 96.4
recreation 96
surfboarding 95.7
two 95.4
swimming pool 94.8
summer 92.5
water 92.3
man 91.6
beach 91.3
swimming 89.5
child 88.6
leisure 87.9
sports equipment 87
sun 85.7
girl 85.6
aircraft 84.8

Imagga
created on 2022-01-09

iron 100
home appliance 88.3
appliance 68.6
durables 43.4
laptop 32.2
computer 31.7
business 29.1
work 27.4
working 24.7
consumer goods 21.6
office 21
technology 18.5
professional 17.7
man 17.5
businessman 16.8
people 16.2
person 14.5
success 14.5
equipment 13.7
male 13.5
worker 13.3
keyboard 13.1
corporate 12.9
hand 12.1
manager 12.1
desk 11.9
job 11.5
home 11.2
tool 10.7
suit 10.2
finance 10.1
occupation 10.1
adult 9.7
career 9.5
construction 9.4
glasses 9.2
studio 9.1
table 9.1
holding 9.1
information 8.8
device 8.7
smiling 8.7
executive 8.6
metal 8
hands 7.8
typing 7.8
blackboard 7.6
shirt 7.5
close 7.4
businesswoman 7.3
pen 7.2
steel 7.1
smile 7.1
modern 7

Microsoft
created on 2022-01-09

text 94.5
black and white 86.2
chair 63.5
furniture 60.9
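
The Microsoft tags follow the same name-plus-confidence pattern and could be reproduced with the Azure Computer Vision SDK. A rough sketch is below; the endpoint, key, and image URL are placeholders, not real values.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: image tagging with Azure Computer Vision.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

result = client.tag_image("https://example.com/steinmetz_beach.jpg")  # hypothetical URL
for tag in result.tags:
    # Confidences come back as 0-1 floats; scaled to percent here to match
    # the "text 94.5", "chair 63.5" presentation above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")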

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Female, 98%
Calm 42.2%
Happy 39.7%
Sad 15.1%
Confused 0.7%
Fear 0.7%
Disgusted 0.6%
Angry 0.6%
Surprised 0.4%
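
The age range, gender, and emotion percentages above match the shape of AWS Rekognition's DetectFaces output when all attributes are requested. A minimal sketch with boto3 follows; the file name and region are assumptions for illustration.

import boto3

# Minimal sketch: face attribute analysis with AWS Rekognition via boto3.
client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_beach.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 24, "High": 34}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 98.0}
    print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        # Emotion types (CALM, HAPPY, SAD, ...) each carry a confidence in percent.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")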

Feature analysis

Amazon

Person 99.6%

Text analysis

Google

bIててすh
bI
ててす
h
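
The Google entries above are raw OCR detections; the photograph evidently contains little legible print, hence the garbled characters. Results of this shape can be produced with the Cloud Vision text detection feature; a minimal sketch with the google-cloud-vision client follows, where the file name is an assumption.

from google.cloud import vision

# Minimal sketch: text detection (OCR) with the Google Cloud Vision API.
client = vision.ImageAnnotatorClient()

with open("steinmetz_beach.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # The first annotation is the full detected string; the rest are the
    # individual fragments, which is why short pieces like "bI" appear above.
    print(annotation.description)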