Human Generated Data

Title

Untitled (men seated at long table)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20171

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99
Human 99
Chair 94.8
Furniture 94.8
Person 93.7
Flooring 92.7
Restaurant 92.6
Sitting 87.5
Meal 87.1
Food 87.1
Person 87
Person 86.1
Wood 77.6
Cafe 75.4
Table 74.9
Floor 74.6
Person 68.4
Plywood 67.8
Handrail 67.1
Banister 67.1
Couch 61.7
Cafeteria 61.3
Clothing 59.6
Apparel 59.6
Face 59.4
Suit 58.8
Coat 58.8
Overcoat 58.8
Female 57.8
Dish 56.4
Person 41.5

Clarifai
created on 2023-10-22

wedding 98.3
people 98
room 96.6
monochrome 96.6
furniture 96
indoors 94.8
table 94.4
chair 93.9
woman 93.7
mirror 93.5
luxury 92.1
hotel 91.5
bride 91.5
adult 90.4
window 88.2
man 87.8
family 87.1
couple 85.7
girl 84.9
two 84.6

Imagga
created on 2022-03-05

room 53.6
interior 52.2
table 44.6
office 39.2
counter 37.4
modern 35.8
furniture 34.5
chair 34.4
home 32.1
house 31.8
floor 30.7
indoors 29
design 25.9
window 24.9
decor 23
inside 22.1
indoor 21.9
luxury 21.5
desk 19.7
architecture 19.5
light 19.4
wood 19.2
lamp 19.1
kitchen 18.3
business 18.2
apartment 18.2
style 17.1
glass 16.4
comfortable 16.2
decoration 15.9
computer 15.5
restaurant 15
empty 14.6
sofa 14.6
dining 14.3
relaxation 14.2
nobody 14
wall 13.8
chairs 13.7
living 13.3
cabinet 13.2
contemporary 13.2
sink 12.9
people 12.8
residential 12.5
laptop 12.4
seat 12.3
meeting 12.3
sitting 12
work 11.8
elegance 11.8
barbershop 11.7
businessman 11.5
mirror 11.4
man 11.4
shop 11.2
corporate 11.2
elegant 11.1
domestic 10.9
vase 10.6
estate 10.5
black 10.2
communication 10.1
clean 10
classroom 9.9
hotel 9.6
pen 9.4
lifestyle 9.4
hall 9.4
person 9.3
space 9.3
dinner 9.3
drink 9.2
tile 9.1
building 8.9
new 8.9
group 8.9
tables 8.9
working 8.8
conference 8.8
lighting 8.7
3d 8.5
male 8.5
rest 8.5
service 8.3
life 8.2
stylish 8.1
center 8.1
team 8.1
adult 8
smiling 8
cabinets 7.9
faucet 7.9
stove 7.9
food 7.9
oven 7.9
day 7.9
bathroom 7.7
couch 7.7
workplace 7.6
executive 7.6
real 7.6
plant 7.5
company 7.4
teamwork 7.4
coffee 7.4
mercantile establishment 7.4
bar 7.4
refrigerator 7.3
furnishing 7.3
color 7.2
professional 7.2
bright 7.2

Google
created on 2022-03-05

Photograph 94.2
Black 89.5
Table 85.9
Interior design 85.6
Black-and-white 85.6
Style 84
Suit 77.2
Desk 76.6
Chair 75.6
Rectangle 75.1
Font 74.3
Snapshot 74.3
Monochrome photography 72.8
Monochrome 72.2
Event 71.1
Plant 69.6
Automotive design 69.4
Building 69.1
Room 68.6
Art 68.6

Microsoft
created on 2022-03-05

indoor 92.2
window 90.6
text 88.3
black and white 86.9
black 72.4
person 71.3
table 68.7
white 67.7
clothing 53.3

Face analysis

Amazon

Google

AWS Rekognition

Age 41-49
Gender Female, 88.6%
Calm 78.5%
Sad 17.8%
Angry 1.3%
Fear 0.8%
Confused 0.5%
Surprised 0.4%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 21-29
Gender Female, 97.7%
Calm 91.3%
Surprised 4.1%
Sad 3.9%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 34-42
Gender Male, 96.6%
Confused 55.8%
Calm 20.1%
Sad 6.2%
Surprised 5.8%
Disgusted 5%
Angry 2.8%
Happy 2.8%
Fear 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 99%
Person 93.7%
Person 87%
Person 86.1%
Person 68.4%
Person 41.5%
Chair 94.8%

Categories

Imagga

interior objects 100%

Text analysis

Amazon

KODAL
EaA

Google

41 YT37A°2 XAGO
41
YT37A°2
XAGO