Human Generated Data

Title

Untitled (woman playing piano, others listening)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20238

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.1
Furniture 98.3
Person 93
Chair 92.3
Person 91.3
Couch 89.5
Interior Design 87.3
Indoors 87.3
Restaurant 87.3
Room 85.8
Person 83.8
Meal 83.4
Food 83.4
Person 83.4
Chair 83
Sitting 81.2
Cafeteria 78.5
Living Room 78.4
Flooring 72.7
Clothing 67.2
Apparel 67.2
People 65.6
Crowd 64.9
Table 64.8
Reception 64.3
Coat 59.9
Suit 59.9
Overcoat 59.9
Reception Room 58.3
Waiting Room 58.3
Cafe 58
Floor 57.2
Audience 55.5

Imagga
created on 2022-03-05

interior 61
restaurant 54.8
room 52.6
chair 47.8
table 44
counter 40.4
house 39.3
furniture 38.6
home 37.5
modern 36.5
building 35.7
decor 31.8
barbershop 31.7
floor 30.7
design 29.8
shop 28.5
indoors 27.2
inside 25.8
dining 25.7
indoor 25.6
window 25
structure 24.7
architecture 24.2
wood 21.7
luxury 21.4
kitchen 20.6
mercantile establishment 20.4
decoration 20.3
light 19.4
glass 18.7
empty 18
apartment 17.2
contemporary 16.9
style 16.3
office 16.3
chairs 15.7
seat 15.6
sofa 15.3
comfortable 15.3
lamp 15.3
living 14.2
place of business 14
3d 13.9
bar 13.9
dinner 13.5
tables 12.8
business 12.8
relax 12.6
residential 12.4
elegance 11.8
food 11.6
cafeteria 11.1
wall 11.1
nobody 10.9
stool 10.9
sitting 10.3
relaxation 10
drink 10
stylish 10
carpet 9.7
hotel 9.5
render 9.5
estate 9.5
people 9.5
man 9.4
lifestyle 9.4
place 9.3
space 9.3
domestic 9
oven 8.8
desk 8.8
living room 8.8
lunch 8.7
lighting 8.7
work 8.6
decorate 8.6
elegant 8.6
classroom 8.4
eat 8.4
coffee 8.3
plant 8.2
door 8.2
cabinet 8.1
life 8.1
person 8
cabinets 7.9
women 7.9
wooden 7.9
refrigerator 7.9
rug 7.9
stove 7.9
diner 7.8
residence 7.8
couch 7.7
windows 7.7
tile 7.6
hall 7.6
fashion 7.5
center 7.5
service 7.4
meal 7.3
establishment 7.1

Google
created on 2022-03-05

Picture frame 94.7
Table 91.8
Window 89.6
Building 89.3
Chair 89.2
Couch 87.6
Standing 86.4
Black-and-white 85
Interior design 85
Style 83.9
Door 79.4
Monochrome 77
Monochrome photography 76.3
Houseplant 75.6
Coffee table 72.3
Plant 71.6
Event 68.8
Room 68.3
Sitting 67.4
Art 67

Microsoft
created on 2022-03-05

indoor 96.4
furniture 91.2
clothing 90.8
person 87.9
window 87.4
table 86.9
man 78.1
house 78
text 77.7
black and white 77.4
chair 64.9

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 76.2%
Sad 52.4%
Calm 36.7%
Happy 6.8%
Angry 1.9%
Surprised 0.6%
Confused 0.6%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 31-41
Gender Female, 61.3%
Calm 77.6%
Sad 7.5%
Happy 5.9%
Angry 2.8%
Surprised 1.8%
Confused 1.7%
Disgusted 1.5%
Fear 1.2%

AWS Rekognition

Age 27-37
Gender Male, 97%
Calm 71.3%
Sad 27.3%
Confused 0.5%
Fear 0.3%
Happy 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Chair 92.3%

Captions

Microsoft

a group of people standing in front of a window 84.6%
a group of people in a room 84.5%
a group of people in front of a window 83.2%

Text analysis

Amazon

II
ЬНИ
РИССО SABREWS ЬНИ
SABREWS
РИССО