Human Generated Data

Title

Untitled (nativity scene display with carolers and crowd)

Date

1953

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6353

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Person 99.4
Person 96.8
Person 95.9
Clothing 92.3
Apparel 92.3
Person 90.7
Person 82.5
Outdoors 75.6
Urban 72.6
Female 72.2
Plant 70.6
Dress 70.6
Furniture 69
Person 67.6
Nature 67.1
Coat 62
Overcoat 62
Suit 62
Building 61.9
Girl 60.1
Horse 57.9
Animal 57.9
Mammal 57.9
Crowd 57.7
Tree 57.6
Neighborhood 56.5
Person 56.1
Face 55.8
Person 43.9

Clarifai
created on 2023-10-26

people 100
many 99.8
group 99.2
group together 97.8
adult 97.1
man 96
street 95.8
woman 95.7
child 92.8
church 92.2
monochrome 91.1
religion 90.7
leader 90.4
ceremony 90
crowd 89.5
one 89.3
war 85.1
several 83.7
funeral 83.4
administration 82.5

Imagga
created on 2022-01-22

cemetery 65.7
city 42.4
architecture 37.5
building 31.4
travel 23.9
sky 23.6
skyline 22.8
cityscape 18.9
structure 18.8
urban 18.4
tower 17.9
tourism 17.3
landmark 17.2
skyscraper 15.8
street 15.6
buildings 15.1
office 14.5
fountain 13.3
old 13.2
town 13
history 12.5
river 12.5
new 12.1
monument 12.1
famous 11.2
water 10.7
university 10.6
downtown 10.6
tourist 10.4
tree 10.1
center 9.9
park 9.9
modern 9.8
skyscrapers 9.8
high 9.5
scene 9.5
tall 9.4
historic 9.2
stone 9.1
column 8.9
trees 8.9
aerial 8.7
ancient 8.6
color 8.3
palace 8.2
statue 8.1
light 8
black 7.8
people 7.8
landscape 7.4
exterior 7.4
business 7.3
religion 7.2

Google
created on 2022-01-22

Photograph 94.3
Building 93.2
Black 90
Black-and-white 86.7
Style 84
Monochrome photography 75.4
Monochrome 74.5
Snapshot 74.3
City 72
Rectangle 71.4
Sky 71.3
Palm tree 70.1
Art 68.8
Suit 68.5
Event 68.3
Room 67.8
Vintage clothing 65.8
History 65.4
Street 64.9
Stock photography 64.9

Microsoft
created on 2022-01-22

building 99.1
text 97.5
funeral 81.4
black and white 79
people 76.3
person 69.5
grave 68.1
cemetery 62.1
several 10

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 53.6%
Calm 22.7%
Sad 21.4%
Happy 20.1%
Disgusted 15.7%
Confused 9.8%
Angry 5.8%
Surprised 3.3%
Fear 1.2%

AWS Rekognition

Age 25-35
Gender Male, 96.6%
Calm 63.6%
Sad 24.6%
Confused 8.3%
Angry 1%
Surprised 0.8%
Disgusted 0.7%
Fear 0.6%
Happy 0.4%

AWS Rekognition

Age 10-18
Gender Female, 61.7%
Calm 79.6%
Sad 11.8%
Confused 2.3%
Happy 1.7%
Surprised 1.4%
Fear 1.2%
Angry 1.1%
Disgusted 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Horse 57.9%

Text analysis

Amazon

mas
RIES:MEATS
GD
PutChristBack
sos

Google

RIES
MEATS
Imas
YT37A°2-XAGON RIES MEATS Put Chris Imas
YT37A°2-XAGON
Put
Chris