Human Generated Data

Title

Untitled (couple standing behind table of food in dining room)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7843

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.4
Person 99.4
Person 98.9
People 90.6
Cake 90.1
Cream 90.1
Creme 90.1
Dessert 90.1
Icing 90.1
Food 90.1
Furniture 89.1
Room 89.1
Indoors 89.1
Plant 78.4
Table 77.9
Shop 76.7
Clothing 73.8
Apparel 73.8
Dining Table 73.5
Meal 73.1
Flower 65.3
Blossom 65.3
Dress 62.8
Sweets 62.5
Confectionery 62.5
Dining Room 59.1
Bakery 58
Dish 57.8
Birthday Party 56.8
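Each Amazon tag above is paired with a confidence score. A minimal sketch of filtering such output by a confidence threshold (the scores are copied from the record; the `confident_labels` helper is hypothetical, not part of any Rekognition SDK):

```python
# A few label/confidence pairs as reported by Amazon Rekognition above.
labels = {
    "Human": 99.4, "Person": 99.4, "Cake": 90.1, "Food": 90.1,
    "Plant": 78.4, "Table": 77.9, "Flower": 65.3, "Birthday Party": 56.8,
}

def confident_labels(scores, threshold=90.0):
    """Return label names meeting the threshold, highest confidence first."""
    return [name for name, conf in sorted(scores.items(), key=lambda kv: -kv[1])
            if conf >= threshold]

print(confident_labels(labels))  # only the tags scored at or above 90
```

With a threshold of 90 this keeps only the high-confidence labels (Human, Person, Cake, Food), which is how the scene-level tags are usually separated from weaker guesses like "Shop" or "Bakery".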

Imagga
created on 2022-01-09

computer 25.9
table 23.6
laptop 22.6
equipment 21.4
device 21.4
electronic equipment 21.3
technology 20.8
desk 20.2
business 19.4
home 19.1
office 18.7
furniture 16.5
interior 15
person 14.9
communication 14.3
adult 13.6
tabletop 12.8
modern 12.6
man 12.3
lifestyle 12.3
room 12.1
happy 11.9
phone 11.1
professional 11.1
telephone 11
work 11
vintage 10.7
people 10.6
stove 10.6
indoors 10.5
old 10.4
sitting 10.3
black 10.2
architecture 10.1
smiling 10.1
house 10
businesswoman 10
working 9.7
floor 9.3
glass 9.3
retro 9
receiver 9
typing 8.8
antique 8.7
wireless 8.6
attractive 8.4
record changer 8.3
food 7.9
design 7.9
smile 7.8
education 7.8
portrait 7.8
corporate 7.7
lunch 7.7
kitchen 7.6
radio receiver 7.6
gramophone 7.6
dinner 7.6
wood 7.5
study 7.5
notebook 7.4
style 7.4
inside 7.4
object 7.3
connection 7.3
mechanism 7.2
handsome 7.1
male 7.1
information 7.1
dial 7.1
job 7.1
decor 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 95.3
indoor 88.1
old 86
table 84.5
birthday cake 71.6
candle 66.8
white 64
wedding cake 54.3
vintage 43

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Female, 99.3%
Happy 51.9%
Calm 37.2%
Sad 4.8%
Angry 2.5%
Confused 1.5%
Fear 0.9%
Disgusted 0.8%
Surprised 0.5%

AWS Rekognition

Age 48-54
Gender Male, 99.8%
Sad 64.7%
Happy 22.9%
Confused 4.6%
Surprised 3.6%
Calm 1.8%
Angry 1.1%
Fear 0.7%
Disgusted 0.5%
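Rekognition reports a full emotion distribution per detected face; the headline emotion is simply the highest-scoring entry. A minimal sketch using the two faces above (scores copied from the record; the `dominant_emotion` helper is hypothetical):

```python
# Emotion scores for the two faces, as reported by AWS Rekognition above.
faces = [
    {"Happy": 51.9, "Calm": 37.2, "Sad": 4.8, "Angry": 2.5},
    {"Sad": 64.7, "Happy": 22.9, "Confused": 4.6, "Surprised": 3.6},
]

def dominant_emotion(emotions):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

for face in faces:
    print(dominant_emotion(face))
# -> ('Happy', 51.9) for the first face, ('Sad', 64.7) for the second
```

Note how close Happy (51.9%) and Calm (37.2%) are for the first face: a single dominant label hides how uncertain the classifier actually is.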

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a vintage photo of a group of people in a room 92.9%
a vintage photo of a person 87.1%
a vintage photo of some people in a room 87%

Text analysis

Amazon

50
#0131

Google

--
50 |ヨー-Y丁3A2--AOOX
|
50
AOOX
-Y
3A2