Human Generated Data

Title

Untitled (people reading in library)

Date

1948

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20131

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.9
Human 98.9
Person 98.5
Person 98.1
Clinic 97.6
Chair 96.6
Furniture 96.6
Person 93.9
Hospital 93.5
Person 93.3
Operating Theatre 72.6
Laptop 61.1
Electronics 61.1
Computer 61.1
Pc 61.1
Lab 55.6
Chair 52.8

Clarifai
created on 2023-10-22

people 99.8
furniture 99.2
group together 99.1
group 98.8
room 98.5
adult 98.4
chair 97.3
table 96.6
man 95.9
indoors 95.6
sit 94.9
administration 94.8
several 93
desk 92.6
leader 91.5
home 91.2
seat 91
woman 90.8
employee 89.9
monochrome 89

Imagga
created on 2022-03-05

room 70.7
interior 62.9
barbershop 58.3
table 54.3
chair 53.9
restaurant 49.2
shop 45.6
furniture 44.8
modern 35.8
mercantile establishment 35.5
house 33.5
decor 32.7
floor 31.6
design 31
cafeteria 30.1
home 29.5
structure 28.3
classroom 26.9
wood 26.7
building 26.1
dining 25.7
indoors 24.6
kitchen 24.4
seat 24
place of business 24
comfortable 22.9
window 22.9
empty 22.4
style 22.3
inside 22.1
dinner 21.1
indoor 21
luxury 20.6
decoration 20.3
contemporary 19.8
glass 19.5
light 19.4
architecture 18.8
office 18.6
area 18.3
lamp 17.2
patio 16.8
chairs 16.7
apartment 16.3
elegance 16
tables 15.8
food 15.7
wall 15.5
residential 15.3
bar 14.8
nobody 14.8
stool 13.8
lunch 13.7
eat 13.4
drink 13.4
service 13
center 12.8
hotel 12.4
living 12.3
meal 12.2
desk 12.1
establishment 12.1
space 11.6
plant 11.2
party 11.2
counter 11.2
drawer 10.9
stylish 10.9
3d 10.9
cabinet 10.8
setting 10.6
decorate 10.5
business 10.3
place 10.3
stove 10.1
relaxation 10.1
oven 9.8
vase 9.7
hall 9.7
urban 9.6
sofa 9.6
tile 9.5
cook 9.2
cabinets 8.9
refrigerator 8.9
lifestyle 8.7
scene 8.7
render 8.7
coffee 8.3
wine 8.3
computer 8.3
domestic 8.1
wooden 7.9
work 7.9
sink 7.9
napkin 7.8
residence 7.8
lighting 7.7
plate 7.6
relax 7.6
lights 7.4
banquet 7.2
life 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

indoor 95.7
text 94.4
window 92.4
chair 86.9
house 86.4
black and white 82.6
white 65.8
desk 61.5
table 27.5
furniture 18.5
several 13.9
dining table 6.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Female, 72.6%
Happy 94.8%
Calm 4.4%
Surprised 0.3%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Fear 0.1%
Sad 0.1%

AWS Rekognition

Age 21-29
Gender Male, 60.6%
Calm 70%
Sad 14.5%
Surprised 8.5%
Fear 3.5%
Confused 1.4%
Angry 1.2%
Disgusted 0.5%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person
Chair
Laptop
Person 98.9%
Person 98.5%
Person 98.1%
Person 93.9%
Person 93.3%
Chair 96.6%
Chair 52.8%
Laptop 61.1%

Categories

Text analysis

Amazon

se
GUTGOIRG