Human Generated Data

Title

Untitled (people dressed up as king, queen, and court)

Date

c. 1950

People

Artist: C. Bennette Moore, American, 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21844

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.3
Human 99.3
Person 97.8
Person 96.1
Person 95.9
Person 95.3
Person 94.3
Person 94.2
Person 93.1
Person 91.9
Person 84
Person 82.9
Vehicle 79.7
Bike 79.7
Bicycle 79.7
Transportation 79.7
Person 79.6
Person 76.9
Crowd 76.5
Sculpture 74.9
Art 74.9
Stage 74.7
Statue 70.2
Flooring 61.4
Person 60.7
Figurine 58.6
Musician 57
Musical Instrument 57
Leisure Activities 56.7
People 55.9
Costume 55.2
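
The Amazon tags above, each paired with a confidence score out of 100, are the kind of output AWS Rekognition's DetectLabels operation returns. A minimal sketch of how such tags can be produced, assuming boto3 is configured with credentials and using a placeholder image path (the exact pipeline behind this record is not documented here):

# Minimal sketch: image tagging with AWS Rekognition DetectLabels.
# "photo.jpg" is a placeholder path, not the actual museum image.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the tag list above bottoms out around 55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")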

Imagga
created on 2022-03-11

man 21.2
city 19.9
men 18
urban 17.5
people 16.7
musical instrument 14.8
building 14
business 14
travel 13.4
walking 13.2
adult 13
black 12.3
footwear 12
outdoors 11.9
architecture 11.9
wind instrument 11.9
male 11.4
boot 11.3
shop 11.2
women 11.1
street 11
day 10.2
person 9.7
group 8.9
clothing 8.9
life 8.8
foot 8.6
brass 8.4
park 8.2
style 8.2
new 8.1
history 8
working 7.9
businessman 7.9
world 7.9
sax 7.8
culture 7.7
crowd 7.7
shoes 7.7
outdoor 7.6
walk 7.6
shoe shop 7.6
legs 7.5
journey 7.5
dark 7.5
windowsill 7.5
window 7.4
shoe 7.4
design 7.3
square 7.2
color 7.2
lifestyle 7.2
activity 7.2
portrait 7.1
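
The Imagga tags follow the same tag-plus-confidence pattern, via Imagga's v2 tagging endpoint. A minimal sketch, assuming placeholder API credentials and a hypothetical image URL:

# Minimal sketch: image tagging with the Imagga v2 API.
# "API_KEY", "API_SECRET", and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
)

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")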

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

statue 92.8
window 90.7
person 80
clothing 66.7
white 65.3
black and white 61.6
altar 19.3

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 65.7%
Happy 53.7%
Calm 41.3%
Sad 1.6%
Angry 1.1%
Disgusted 1.1%
Surprised 0.5%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 48-56
Gender Male, 94.9%
Calm 43.3%
Disgusted 25.4%
Sad 10.3%
Confused 5.3%
Happy 4.4%
Angry 4.3%
Surprised 3.7%
Fear 3.2%

AWS Rekognition

Age 31-41
Gender Male, 99.4%
Happy 52.7%
Calm 36.7%
Confused 4.6%
Surprised 3.2%
Sad 1.4%
Disgusted 0.7%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 31-41
Gender Male, 85.2%
Calm 68.9%
Sad 20.3%
Happy 4.5%
Surprised 2%
Disgusted 1.4%
Angry 1.3%
Confused 1%
Fear 0.6%

AWS Rekognition

Age 29-39
Gender Male, 76.5%
Sad 55.8%
Happy 27.2%
Disgusted 5.2%
Calm 4.8%
Angry 2.9%
Confused 1.8%
Surprised 1.4%
Fear 1%

AWS Rekognition

Age 50-58
Gender Male, 64.2%
Sad 51.4%
Calm 24%
Happy 13.9%
Disgusted 4.1%
Fear 2.3%
Confused 2.1%
Angry 1.6%
Surprised 0.6%

AWS Rekognition

Age 31-41
Gender Female, 95.7%
Calm 91.1%
Happy 5%
Sad 2.8%
Confused 0.4%
Disgusted 0.3%
Fear 0.2%
Surprised 0.2%
Angry 0.1%

AWS Rekognition

Age 23-33
Gender Female, 91.7%
Happy 94.3%
Surprised 1.3%
Calm 1.2%
Fear 1%
Disgusted 0.8%
Confused 0.6%
Sad 0.5%
Angry 0.3%

AWS Rekognition

Age 51-59
Gender Male, 98.4%
Sad 44.6%
Calm 32.1%
Disgusted 9.5%
Confused 4.9%
Happy 2.8%
Surprised 2.6%
Angry 2.2%
Fear 1.3%

AWS Rekognition

Age 35-43
Gender Male, 99.3%
Calm 79.1%
Happy 4.7%
Surprised 3.5%
Confused 3.5%
Disgusted 3.3%
Sad 2.8%
Fear 2.4%
Angry 0.7%

AWS Rekognition

Age 23-31
Gender Female, 100%
Happy 63.6%
Sad 14.4%
Angry 13.6%
Fear 2.4%
Disgusted 1.9%
Confused 1.4%
Surprised 1.4%
Calm 1.2%
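
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and emotion scores that sum to roughly 100%. A minimal sketch of how such per-face estimates are obtained, again assuming configured boto3 credentials and a placeholder image path:

# Minimal sketch: per-face age, gender, and emotion estimates with
# AWS Rekognition DetectFaces. "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")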

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
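
The Google Vision blocks report the same faces as likelihood buckets (Very unlikely through Very likely) rather than percentages, and include image-quality attributes such as headwear and blur. A minimal sketch, assuming the google-cloud-vision client library with configured credentials and a placeholder image path:

# Minimal sketch: per-face likelihood buckets with Google Cloud Vision.
# "photo.jpg" is a placeholder path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood values range from VERY_UNLIKELY to VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)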

Feature analysis

Amazon

Person 99.3%
Bicycle 79.7%

Captions

Microsoft

a person sitting in front of a window 49%
a person standing in front of a window 48.9%
a person standing next to a window 48.8%
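
The Microsoft captions are ranked natural-language descriptions with confidence scores, of the kind the Azure Computer Vision "describe" operation produces. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package and placeholder endpoint, key, and image path:

# Minimal sketch: ranked image captions from Azure Computer Vision.
# The endpoint, key, and "photo.jpg" are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("photo.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")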

Text analysis

Amazon

92361
YT33A2
MIIR YT33A2
D
MIIR

Google

7236.
7236.
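
The text analysis entries are strings the OCR services detected in the image. A minimal sketch of how such strings are extracted with AWS Rekognition DetectText, assuming configured boto3 credentials and a placeholder image path:

# Minimal sketch: OCR with AWS Rekognition DetectText.
# "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word duplicates
        print(detection["DetectedText"])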