Human Generated Data

Title

Untitled (man seated near piano in trailer)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7580

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Studio 95.6
Person 91.2
Human 91.2
Chair 88.7
Furniture 88.7
Electronics 77.6
Keyboard 70.1
Screen 64.4
Display 61.7
Monitor 61.7
Train 59.6
Transportation 59.6
Vehicle 59.6
Interior Design 59.1
Indoors 59.1
Leisure Activities 58.9

Clarifai
created on 2023-10-25

people 99.6
room 98.7
monochrome 98.5
furniture 98.1
chair 96.5
analogue 95.9
indoors 94.6
adult 94.5
piano 94
seat 93.9
two 92.6
instrument 92
group 90.8
desk 90.7
music 90.5
one 90.3
man 88
jazz 87.5
employee 85.9
black and white 85.8

Imagga
created on 2022-01-08

barbershop 58.4
shop 49
chair 44.3
interior 36.3
mercantile establishment 35.1
room 33.3
seat 32.5
furniture 30.9
barber chair 30.2
modern 25.2
office 23.8
place of business 23.6
indoors 22.9
table 22.7
inside 22.1
business 18.8
transportation 17.9
light 16.7
equipment 16.3
restaurant 15.9
floor 15.8
car 15.1
house 14.2
decor 14.2
architecture 14.1
empty 13.8
window 13.7
home 13.6
travel 13.4
lamp 13.4
work 13.4
design 13
luxury 12.9
building 12.7
comfortable 12.4
working 12.4
urban 12.2
kitchen 12.1
industry 12
transport 11.9
establishment 11.8
chairs 11.8
people 11.7
center 11.6
glass 10.9
nobody 10.9
machine 10.7
wall 10.3
passenger 10.2
vehicle 10.1
wood 10
computer 9.7
men 9.4
3d 9.3
structure 9.3
city 9.1
indoor 9.1
technology 8.9
device 8.8
man 8.7
decoration 8.7
scene 8.7
automobile 8.6
life 8.6
dining 8.6
train 8.5
relax 8.4
fast 8.4
elegance 8.4
horizontal 8.4
street 8.3
speed 8.2
counter 8.2
furnishing 8.2
reflection 8.1
monitor 8
space 7.8
motion 7.7
apartment 7.7
traffic 7.6
dinner 7.6
bar 7.4
hospital 7.3
steel 7.1
businessman 7.1

Microsoft
created on 2022-01-08

text 99.2
black 82.7
black and white 80.7
furniture 79.6
white 75.9
piano 65.6

Face analysis

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.2%
Chair 88.7%
Train 59.6%

Text analysis

Amazon

es
BOAT
39846
KODAK-EEA

Google

sa
YT
ヨヨ
A
°
2-
AGOM
2
sa YTヨヨA°2- AGOM 2カ868
868