Human Generated Data

Title

Untitled (children in classroom led by nun)

Date

1949

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6245

Machine Generated Data

Tags

Amazon
created on 2022-01-22

School 99.7
Room 99.7
Indoors 99.7
Classroom 99.7
Person 99.3
Human 99.3
Person 98.4
Person 98.4
Person 95.2
Person 94.7
Person 92.8
Person 89.1
Person 88.8
Person 81
Person 80.1
Restaurant 78.1
Interior Design 72.3
Person 70.2
Cafeteria 67.2
Person 67.1
Person 65
Furniture 60.1
Person 49.3

Imagga
created on 2022-01-22

room 80.3
classroom 75.4
restaurant 45.2
interior 34.5
chair 29.1
table 28.8
hall 24.5
building 24.3
cafeteria 21.9
shop 21.6
dining 20
furniture 18.5
dinner 18.1
chairs 15.7
party 15.5
food 15.3
decor 15
indoors 14.9
service 14.8
modern 14.7
structure 14.1
design 14.1
business 14
decoration 13.7
indoor 13.7
hotel 13.4
drink 13.4
inside 12.9
people 12.8
tables 12.8
home 12.8
eat 12.6
house 12.5
mercantile establishment 12.3
office 12
style 11.9
counter 11.8
banquet 11.7
glass 11.7
elegant 11.1
meal 10.8
setting 10.6
empty 10.3
luxury 10.3
place 10.2
bar 10.2
architecture 10.2
stool 9.9
lunch 9.8
catering 9.8
celebration 9.6
light 9.4
center 9.3
floor 9.3
wedding 9.2
wood 9.2
reception 8.8
napkin 8.8
knife 8.6
work 8.6
fork 8.6
comfortable 8.6
barbershop 8.5
desk 8.5
contemporary 8.5
seat 8.3
city 8.3
place of business 8.2
kitchen 8.1
urban 7.9
flowers 7.8
scene 7.8
elegance 7.6
coffee 7.4
dish 7.2
lifestyle 7.2
holiday 7.2

Google
created on 2022-01-22

Black 89.6
Chair 86.4
Black-and-white 84.9
Style 83.8
Table 80.1
Monochrome photography 76.6
Monochrome 75.8
Snapshot 74.3
Office equipment 73.6
Event 71.4
Room 69.9
Font 66.7
Stock photography 62.3
Urban design 58.3
Houseplant 57.4
Class 57
Pattern 55.5
Desk 55.1
Rectangle 54.5
Building 52.1

Microsoft
created on 2022-01-22

text 99
person 85.6
black and white 77.5
people 70.1
building 64.6
shop 14.2

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Male, 69.5%
Happy 30.2%
Angry 26.6%
Sad 16.1%
Calm 13.4%
Fear 6.9%
Disgusted 2.9%
Confused 2.6%
Surprised 1.4%

AWS Rekognition

Age 18-26
Gender Male, 92.5%
Sad 41.6%
Fear 37.8%
Calm 17.3%
Happy 1.1%
Confused 0.8%
Disgusted 0.7%
Angry 0.6%
Surprised 0.3%

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people sitting at a table 75.9%
a group of people sitting in a restaurant 75.8%
a group of people in a restaurant 75.7%

Text analysis

Amazon

the
would
Number
hello
S
mode
Number I the mode it the - S
should
is
-
to
KEL
should spall
it
spall
I
YT37A2-YA

Google

YTヨヨA2
A2
YT