Human Generated Data

Title

Untitled (man and woman emerging from church with crowd)

Date

1952

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6297

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.6
Apparel 99.6
Tie 98.3
Accessories 98.3
Accessory 98.3
Person 98.2
Human 98.2
Person 97.4
Person 96.4
Person 94.9
Person 93.2
Person 89.9
Overcoat 89.7
Coat 89.7
Sleeve 84.6
Suit 83.3
Person 76.3
Plant 67.7
Long Sleeve 65.9
Face 62.7
Evening Dress 62.6
Fashion 62.6
Gown 62.6
Robe 62.6
Tuxedo 60.7
Suit 55.8
Person 53.1

Clarifai
created on 2023-10-26

people 99.6
monochrome 97.6
man 96.8
group 95.3
street 94.6
group together 94
adult 93.2
vehicle 91.7
coat 91.2
woman 90.9
transportation system 90.1
couple 90
leader 88.5
three 87.2
administration 85.9
airport 85.7
two 84
portrait 83.4
vehicle window 83.1
wedding 83

Imagga
created on 2022-01-22

man 33.6
male 26.9
people 26.8
person 23.6
business 22.5
adult 21.4
umbrella 16.5
men 16.3
car 16.2
suit 15.5
portrait 15.5
city 15
passenger 14.8
happy 13.8
automobile 13.4
office 13.4
groom 13.3
businessman 13.2
urban 13.1
smile 12.8
clothing 12.5
couple 12.2
black 12.1
professional 11.9
old 11.8
building 11.6
vehicle 11.6
standing 11.3
work 11
smiling 10.8
transportation 10.7
executive 10.6
life 10.5
lifestyle 10.1
outdoors 9.7
indoors 9.7
women 9.5
love 9.5
corporate 9.4
day 9.4
industry 9.4
casual 9.3
street 9.2
shelter 9.2
occupation 9.2
attractive 9.1
holding 9.1
cheerful 8.9
looking 8.8
driver 8.7
canopy 8.7
sitting 8.6
drive 8.5
worker 8.5
travel 8.4
room 8.4
mature 8.4
vintage 8.3
garment 8.3
human 8.2
job 8
working 7.9
jacket 7.8
hand 7.6
career 7.6
senior 7.5
world 7.5
dress 7.2
shop 7.2
handsome 7.1
interior 7.1

Google
created on 2022-01-22

Coat 86.2
Black-and-white 84.2
Style 83.8
Headgear 83.5
Hat 83
Suit 76.1
Blazer 73.2
Vintage clothing 72.2
Monochrome 71.5
Monochrome photography 71.4
Sun hat 68
Crew 67.7
Room 66.6
Fedora 62.6
Stock photography 62.1
History 59.9
Art 58.9
Smile 57.4
Uniform 57.1
Arch 56.4

Microsoft
created on 2022-01-22

person 99.3
text 96.9
clothing 93.8
black and white 89.7
man 78.2
standing 75.7

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.5%
Happy 59.9%
Calm 29.9%
Angry 3%
Surprised 2.3%
Confused 1.9%
Disgusted 1.9%
Sad 0.6%
Fear 0.5%

AWS Rekognition

Age 25-35
Gender Female, 94.5%
Calm 99.4%
Angry 0.2%
Happy 0.1%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 58.8%
Happy 76.7%
Calm 19.6%
Confused 1.1%
Sad 1.1%
Surprised 0.6%
Disgusted 0.3%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 48-54
Gender Female, 51.8%
Calm 98.7%
Happy 0.5%
Sad 0.4%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 51-59
Gender Female, 67%
Calm 98.7%
Happy 0.5%
Disgusted 0.3%
Confused 0.2%
Sad 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Tie 98.3%
Person 98.2%
Suit 83.3%

Text analysis

Amazon

YТ3А°-