Human Generated Data

Title

Untitled (children in graduation cap and gown, on stage)

Date

1949

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18277

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Interior Design 100
Indoors 100
Person 98.5
Human 98.5
Room 98.3
Person 98
Auditorium 91.5
Theater 91.5
Hall 91.5
Person 90.9
Person 88.2
Person 85.9
Person 85.4
Stage 83.9
Person 83.1
Person 82.3
Person 81.1
Leisure Activities 72.4
Crowd 71.6
Person 68.6
Person 62.3
Person 62.2
Person 60.6
Poster 58.8
Advertisement 58.8
Living Room 58.3
Orchestra Pit 56.1
Flooring 55.7
Court 55.2
Person 54.1
Person 53.8
Person 47.5

Imagga
created on 2022-03-04

equipment 31.8
electronic equipment 23.1
business 22.5
blackboard 18.1
bank 17.9
money 17.9
finance 17.7
amplifier 17.3
device 16.9
financial 15.1
case 14.1
cash 13.7
currency 13.5
technology 13.4
paper 13.3
old 13.2
banking 12.9
black 11.4
office 10.4
savings 10.2
dollar 10.2
metal 9.6
bill 9.5
horizontal 9.2
music 9.1
close 9.1
retro 9
wealth 9
radio 8.6
musical instrument 8.5
showing 8.4
vintage 8.4
machine 8.1
register 7.9
shredder 7.8
folder 7.8
bills 7.8
sell 7.7
payment 7.7
card 7.6
audio 7.6
city 7.5
rich 7.4
economy 7.4
sale 7.4
object 7.3
digital 7.3
container 7.3
computer 7.3
steel 7.1
work 7.1
architecture 7

Google
created on 2022-03-04

Rectangle 87.4
Font 79.7
Watercraft 72.5
Monochrome photography 71.3
Monochrome 67.2
Display device 67.2
Event 66.3
Art 62.5
Room 60.8
History 60.3
Machine 58.3
Visual arts 56.8
Advertising 56.5

Microsoft
created on 2022-03-04

text 97.8
person 91.7
black 67
white 64.5
black and white 62.6
store 31.8

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 86.5%
Sad 7.9%
Happy 4.1%
Confused 0.5%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 50-58
Gender Male, 99.9%
Calm 78%
Sad 15.2%
Happy 3.9%
Confused 1.4%
Disgusted 0.5%
Angry 0.3%
Surprised 0.3%
Fear 0.3%

AWS Rekognition

Age 22-30
Gender Male, 78%
Sad 76.1%
Calm 19.5%
Fear 1.9%
Angry 1.1%
Confused 0.6%
Disgusted 0.3%
Happy 0.3%
Surprised 0.1%

AWS Rekognition

Age 30-40
Gender Female, 96.6%
Calm 55.9%
Sad 25.5%
Happy 10.7%
Fear 4.9%
Angry 1%
Confused 1%
Disgusted 0.7%
Surprised 0.3%

AWS Rekognition

Age 26-36
Gender Male, 83.9%
Calm 77.9%
Surprised 19.8%
Disgusted 0.6%
Sad 0.4%
Fear 0.4%
Confused 0.4%
Happy 0.3%
Angry 0.2%

AWS Rekognition

Age 21-29
Gender Female, 87.5%
Sad 63.8%
Calm 33.3%
Fear 0.9%
Disgusted 0.8%
Surprised 0.4%
Angry 0.3%
Happy 0.3%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 98.5%
Poster 58.8%

Captions

Microsoft

a group of people standing in front of a store 57.1%
a group of people in front of a store 54%
a group of people sitting in front of a store 38.8%

Text analysis

Amazon

THE
BETTER
JUNE
BEST
MA KE THE
KE
MA
TO
KO

Google

KE
MA KE THE BEFST YT37A2-
BEFST
MA
THE
YT37A2-