Human Generated Data

Title

Untitled (seven teenage students reading books in school library)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9388

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Person 99.1
Person 98.3
Person 95.7
Chair 95.4
Furniture 95.4
Home Decor 95.3
Person 93.9
Person 93.3
Room 91.9
Indoors 91.9
Library 84.4
Book 84.4
Nature 71.7
Flooring 71.6
Person 71.1
Outdoors 71
Clothing 62.3
Apparel 62.3
Bookcase 60.3
Meal 58.8
Food 58.8
Shelf 57.5
Floor 55.3
Chair 53
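
The Amazon tags above are label names paired with confidence scores on a 0-100 scale. A minimal sketch of how such labels can be produced with the AWS Rekognition detect_labels API via boto3 follows; the file name and the MinConfidence threshold are illustrative assumptions, not details of the museum's actual pipeline.

    import boto3

    # Minimal sketch, assuming AWS credentials are configured locally.
    # "photograph.jpg" and the thresholds are illustrative only.
    client = boto3.client("rekognition")

    with open("photograph.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=50,
    )

    # Each label carries a name and a 0-100 confidence, matching entries such
    # as "Person 99.3" above. Labels with Instances also include bounding
    # boxes, which is what the Feature analysis section further down reports.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")
        for instance in label.get("Instances", []):
            print("  box:", instance["BoundingBox"])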

Clarifai
created on 2023-10-26

people 99.8
adult 98.5
furniture 98.2
group together 98.1
group 97
man 96.7
room 94.4
woman 93.8
monochrome 91.7
several 91.5
two 90.4
many 89.8
education 89.1
indoors 88.5
leader 87.9
administration 87.4
sit 86.5
chair 85
child 84.6
three 84.6
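
The Clarifai concepts above are scored on a 0-1 scale, rendered here as percentages. A rough sketch of a call to Clarifai's v2 predict endpoint with the general image-recognition model follows; the endpoint path, model name, and payload shape are assumptions based on Clarifai's public REST documentation and should be checked against current docs, and the access token and image URL are placeholders.

    import requests

    # Sketch only: endpoint, model name, and request shape are assumptions
    # drawn from Clarifai's documented v2 REST API; verify before use.
    PAT = "YOUR_CLARIFAI_PAT"                          # placeholder credential
    IMAGE_URL = "https://example.com/photograph.jpg"   # placeholder image

    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {PAT}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    response.raise_for_status()

    # Concepts arrive with a name and a 0-1 value, e.g. ("people", 0.998),
    # corresponding to the "people 99.8" entries above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")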

Imagga
created on 2022-01-23

room 94.5
classroom 74.5
interior 53.1
table 45.5
chair 40.8
modern 34.4
furniture 32.4
house 30.1
building 29.7
restaurant 29.5
home 27.1
floor 26
office 25.8
cafeteria 24
structure 23.9
decor 23.9
wood 22.5
design 22.5
indoors 22
kitchen 20.4
dining 20
inside 19.3
indoor 19.2
library 16.7
architecture 16.6
business 16.4
glass 16.3
empty 16.3
hall 15.9
apartment 15.3
contemporary 15.1
work 14.2
people 14
window 13.9
women 13.5
residential 13.4
3d 13.2
desk 13
group 12.9
food 12.7
chairs 12.7
light 12.7
counter 12.6
decoration 12.3
male 12.1
sofa 11.5
lamp 11.4
meeting 11.3
sitting 11.2
luxury 11.2
style 11.1
wall 11.1
drink 10.9
stove 10.8
oven 10.8
team 10.8
conference 10.8
businessman 10.6
seat 10.5
comfortable 10.5
plant 10.5
living 10.4
render 10.4
teamwork 10.2
cabinets 9.9
stool 9.9
refrigerator 9.9
wooden 9.7
together 9.6
urban 9.6
men 9.5
man 9.4
lifestyle 9.4
executive 9.3
dinner 9.3
bar 9.2
person 9.2
relaxation 9.2
cook 9.2
success 8.9
area 8.7
day 8.6
corporate 8.6
decorate 8.6
tile 8.6
center 8.6
space 8.5
eat 8.4
laptop 8.2
happy 8.1
cabinet 7.9
class 7.7
hotel 7.6
patio 7.6
elegance 7.6
city 7.5
manager 7.5
businesswoman 7.3
domestic 7.2
computer 7.2
school 7.1
job 7.1
steel 7.1
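
The Imagga tags are likewise confidence-scored labels. A sketch of a request to Imagga's v2 tagging endpoint follows; the endpoint path, parameter name, and response fields are assumptions based on Imagga's documented REST API, and the API key, secret, and image URL are placeholders.

    import requests

    # Sketch only: endpoint and response fields follow Imagga's documented
    # v2 API as an assumption; credentials and image URL are placeholders.
    API_KEY = "YOUR_IMAGGA_KEY"
    API_SECRET = "YOUR_IMAGGA_SECRET"
    IMAGE_URL = "https://example.com/photograph.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    # Tags are returned with a 0-100 confidence, matching "room 94.5" above.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")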

Google
created on 2022-01-23

Window 93.3
Shelf 87.5
Interior design 84.8
Black-and-white 84.6
Chair 83.5
Building 81.6
Bookcase 80.8
Art 78.5
Monochrome photography 75.8
Font 75.8
Monochrome 74.8
Shelving 73.4
Room 72.1
Machine 67.7
Visual arts 61.6
Door 60.8
Sitting 58.1
Illustration 56.5
Collection 54.8
Child 54.3
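
The Google tags correspond to Cloud Vision label detection, which scores labels on a 0-1 scale (shown here as percentages). A minimal sketch using the google-cloud-vision client library follows; the local file name is an illustrative assumption.

    from google.cloud import vision

    # Minimal sketch, assuming Google Cloud credentials are configured.
    # The file name is illustrative only.
    client = vision.ImageAnnotatorClient()

    with open("photograph.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Each annotation has a description and a 0-1 score, e.g. ("Window", 0.933),
    # matching the "Window 93.3" entries above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")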

Microsoft
created on 2022-01-23

text 98.4
furniture 98.3
table 94
house 90.3
chair 89.4
indoor 85.4
clothing 84
person 81.9
black and white 76.6
room 44.4
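
The Microsoft tags match the output of the Azure Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision client follows; the endpoint, key, and image URL are placeholders, and the method and field names follow that SDK as an assumption to check against current documentation.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Sketch only: endpoint, key, and image URL are placeholders; method and
    # field names follow the Azure Computer Vision SDK as an assumption.
    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    result = client.tag_image("https://example.com/photograph.jpg")

    # Tags carry a name and a 0-1 confidence, e.g. ("furniture", 0.983),
    # matching the "furniture 98.3" entries above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")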

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 98.1%
Calm 88.2%
Surprised 7.8%
Disgusted 0.9%
Confused 0.8%
Sad 0.7%
Happy 0.7%
Angry 0.5%
Fear 0.5%

AWS Rekognition

Age 20-28
Gender Female, 93.2%
Calm 99.2%
Sad 0.6%
Disgusted 0.1%
Happy 0%
Confused 0%
Fear 0%
Angry 0%
Surprised 0%

AWS Rekognition

Age 37-45
Gender Male, 92.8%
Sad 52.3%
Calm 35.6%
Happy 5.1%
Confused 2.7%
Angry 1.9%
Disgusted 1.1%
Surprised 0.8%
Fear 0.4%
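
The age range, gender estimate, and emotion scores above are the fields returned by AWS Rekognition face detection. A minimal sketch with boto3 follows; the file name is illustrative.

    import boto3

    # Minimal sketch, assuming AWS credentials are configured locally;
    # the file name is illustrative only.
    client = boto3.client("rekognition")

    with open("photograph.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    # Each face detail carries an age range, a gender estimate with confidence,
    # and a list of emotions scored 0-100, matching the blocks above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")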

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
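
The Google Vision rows report likelihood ratings (Very unlikely through Very likely) rather than numeric scores for joy, sorrow, anger, surprise, headwear, and blur. A minimal sketch with the google-cloud-vision client follows; the file name is illustrative.

    from google.cloud import vision

    # Minimal sketch, assuming Google Cloud credentials are configured.
    client = vision.ImageAnnotatorClient()

    with open("photograph.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face annotation reports likelihood enums (VERY_UNLIKELY ... VERY_LIKELY),
    # matching the rows above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)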

Feature analysis

Amazon

Person 99.3%
Chair 95.4%

Text analysis

Amazon

veces

Google

s a a a p
s
a
p
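
The text analysis rows are OCR output: "veces" is what Amazon Rekognition read in the photograph, and the letter fragments are Google's full-text annotation followed by its individual word detections. A minimal sketch of both calls follows; the file name is illustrative.

    import boto3
    from google.cloud import vision

    # Minimal sketch, assuming credentials for both services are configured;
    # the file name is illustrative only.
    with open("photograph.jpg", "rb") as f:
        image_bytes = f.read()

    # AWS Rekognition text detection: LINE and WORD detections with confidence.
    rekognition = boto3.client("rekognition")
    for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
        if detection["Type"] == "LINE":
            print("Amazon:", detection["DetectedText"])

    # Google Cloud Vision text detection: the first annotation is the full text,
    # the remaining annotations are the individual words.
    vision_client = vision.ImageAnnotatorClient()
    response = vision_client.text_detection(image=vision.Image(content=image_bytes))
    if response.text_annotations:
        print("Google:", response.text_annotations[0].description)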