Human Generated Data

Title

Untitled (cocktail party on terrace overlooking lake)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17260

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.5
Apparel 99.5
Person 98.8
Human 98.8
Person 98.2
Person 98.1
Person 98.1
Person 97.3
Person 96.1
Plant 88
Dress 85.5
Robe 84.9
Fashion 84.9
Suit 83.9
Overcoat 83.9
Coat 83.9
Female 82.2
Gown 81.9
Person 80.1
Wedding 75.2
Church 72.9
Architecture 72.9
Building 72.9
Chair 68.2
Furniture 68.2
Flower 66
Blossom 66
Person 65
Bridegroom 62.7
Face 62.3
Woman 62.3
Meal 62.1
Food 62.1
Wedding Gown 61
Tree 60.8
Portrait 60.3
Photography 60.3
Photo 60.3
Priest 59.6
Flower Arrangement 58.3
Altar 58.3
Girl 58.1
Tuxedo 57.4
Evening Dress 55.9
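
The label and score pairs above are confidence values (0-100) of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags might be produced, assuming boto3 is installed, AWS credentials are configured, and the photograph is available as a hypothetical local file:

    # Sketch: label detection with Amazon Rekognition DetectLabels.
    # Assumes boto3 and configured AWS credentials; "photo.jpg" is a
    # hypothetical local copy of the image.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=55,
        )

    for label in response["Labels"]:
        # Each label pairs a name with a 0-100 confidence score,
        # matching the "tag score" lines listed above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')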

Clarifai
created on 2023-10-29

people 99.7
group 97.7
adult 96.7
woman 96.6
group together 95
man 94.2
wedding 92.4
child 90.7
administration 90.5
wear 89.3
several 87
home 82
monochrome 81.5
many 80.6
recreation 80.6
leader 80.4
portrait 80.1
four 77.8
facial expression 77.5
outfit 77.3

Imagga
created on 2022-02-26

man 19.5
world 18.5
person 18.2
male 17
silhouette 16.5
architecture 15.6
old 15.3
people 15.1
sky 14
scene 13.8
lights 13
symbol 12.8
astronaut 12.8
businessman 12.4
crowd 11.5
boss 11.5
travel 11.3
flag 11.2
landscape 11.2
clouds 11
city 10.8
president 10.8
sunset 10.8
cheering 10.8
nighttime 10.8
audience 10.7
work 10.4
business 10.3
town 10.2
occupation 10.1
tree 10
stadium 9.9
supporters 9.9
team 9.9
speech 9.8
building 9.8
leader 9.6
patriotic 9.6
nation 9.5
presentation 9.3
vivid 9.3
teamwork 9.3
businesswoman 9.1
sun 8.9
job 8.8
sexy 8.8
vibrant 8.8
urban 8.7
bright 8.6
historic 8.2
landmark 8.1
religion 8.1
history 8
trees 8
icon 7.9
grass 7.9
ancient 7.8
stone 7.8
light 7.6
historical 7.5
religious 7.5
park 7.4
countryside 7.3
mountain 7.1
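
Imagga exposes its auto-tagger through a REST endpoint. A minimal sketch of how tags like the ones above might be retrieved, assuming the requests library; the API key, secret, and image URL are placeholders:

    # Sketch: auto-tagging via the Imagga /v2/tags REST endpoint.
    # The API key, secret, and image URL below are placeholders.
    import requests

    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/photo.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        # Each entry pairs an English tag with a 0-100 confidence value.
        print(item["tag"]["en"], round(item["confidence"], 1))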

Google
created on 2022-02-26

Tree 83.7
Hat 83.4
Adaptation 79.3
Sleeve 77.6
Monochrome photography 73.2
Monochrome 71.9
Vintage clothing 70.4
Uniform 69.9
Font 69.4
Crew 68.9
Event 66.7
Room 66.2
Stock photography 64.6
History 64.4
Suit 62.5
Illustration 59.3
Photo caption 58.2
Classic 57.2
Team 54.5
Workwear 52.9
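
Labels such as these can be obtained from the Google Cloud Vision API's label detection feature. A minimal sketch, assuming the google-cloud-vision client library, application default credentials, and a hypothetical local file:

    # Sketch: label detection with the Google Cloud Vision client library.
    # Assumes application default credentials; "photo.jpg" is hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    for label in response.label_annotations:
        # Vision reports scores in the 0-1 range; scale by 100 to compare
        # with the values listed above.
        print(label.description, round(label.score * 100, 1))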

Microsoft
created on 2022-02-26

clothing 96.7
person 91.5
text 90.8
outdoor 89.4
wedding dress 84.3
man 79.7
woman 78.9
black and white 76.6
bride 63.2
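
Tags like these can be produced with Azure's Computer Vision service. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    # Sketch: image tagging with the Azure Computer Vision SDK
    # (azure-cognitiveservices-vision-computervision). The endpoint,
    # key, and image URL below are placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
    KEY = "your_key"
    IMAGE_URL = "https://example.org/photo.jpg"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    result = client.tag_image(IMAGE_URL)

    for tag in result.tags:
        # Each tag carries a name and a 0-1 confidence value.
        print(tag.name, round(tag.confidence * 100, 1))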

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Female, 51.5%
Calm 82.5%
Surprised 11.8%
Sad 3.3%
Fear 0.8%
Angry 0.5%
Confused 0.5%
Happy 0.4%
Disgusted 0.2%

AWS Rekognition

Age 41-49
Gender Female, 97.3%
Calm 85.5%
Sad 5.9%
Fear 2.4%
Happy 2.1%
Surprised 1.7%
Disgusted 1.2%
Angry 0.8%
Confused 0.4%
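
Age ranges, gender estimates, and emotion scores of this kind are returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming boto3, configured AWS credentials, and a hypothetical local file:

    # Sketch: facial attribute estimation with Amazon Rekognition DetectFaces.
    # Assumes boto3 and configured AWS credentials; "photo.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            # Emotion confidences are percentages, as in the lists above.
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')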

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
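
The Surprise, Anger, Sorrow, Joy, Headwear, and Blurred rows follow the likelihood buckets reported by Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical local file:

    # Sketch: face detection likelihoods with Google Cloud Vision.
    # Assumes application default credentials; "photo.jpg" is hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihoods come back as buckets such as VERY_UNLIKELY or UNLIKELY,
        # matching the wording of the rows above.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)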

Feature analysis

Amazon

Person
Person 98.8%
Person 98.2%
Person 98.1%
Person 98.1%
Person 97.3%
Person 96.1%
Person 80.1%
Person 65%

Categories

Text analysis

Amazon

YT37A2-
YT37A2- "Agox
"Agox