Human Generated Data

Title

Untitled (two brides posed with their bridesmaids and flower girl on stage decorated with foliage)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9384

Machine Generated Data

Tags

Each entry pairs a detected label with the service's confidence score, in percent.

Amazon
created on 2022-01-23

Person 98.5
Human 98.5
Person 94
Person 93.3
Person 88
Art 86.3
Apparel 82.6
Clothing 82.6
Person 81
People 75.4
Person 72.3
Person 72.2
Drawing 70.9
Person 68.9
Person 66.4
Painting 65.6
Person 63.8
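The Amazon tags above are a flat list of `label confidence` pairs. As an illustrative sketch (not the museum's actual pipeline), the list could be parsed and filtered by a confidence threshold like this; the sample values are copied from this record:

```python
# Parse "label confidence" lines like the Amazon tags above and keep
# only labels at or above a chosen confidence threshold.
raw_tags = """\
Person 98.5
Human 98.5
Art 86.3
Apparel 82.6
Clothing 82.6
People 75.4
Drawing 70.9
Painting 65.6
"""

def parse_tags(text):
    """Split each line into (label, confidence); the label may contain spaces."""
    tags = []
    for line in text.strip().splitlines():
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

def filter_tags(tags, threshold=80.0):
    """Keep labels whose confidence meets the threshold (percent)."""
    return [(label, score) for label, score in tags if score >= threshold]

print(filter_tags(parse_tags(raw_tags)))
```

With the default threshold of 80, low-confidence labels such as "Painting 65.6" are dropped while "Person 98.5" is kept.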

Imagga
created on 2022-01-23

architecture 26.6
building 26.3
night 24.9
structure 23.8
city 20.8
billboard 18.6
urban 17.5
tower 16.1
signboard 15.3
sky 15.3
landscape 14.9
light 14
scene 13.8
tree 13.8
cityscape 13.2
shop 12.9
old 12.5
travel 12
landmark 11.7
business 11.5
skyline 11.4
bridge 11.3
water 11.3
grunge 11.1
house 11
river 10.7
park 10.6
skyscraper 10.6
barbershop 10.4
finance 10.1
blackboard 10.1
history 9.8
art 9.7
reflection 9.6
lights 9.3
street 9.2
silhouette 9.1
scenery 9
design 9
mercantile establishment 9
financial 8.9
scenic 8.8
forest 8.7
downtown 8.6
winter 8.5
vintage 8.3
historic 8.2
currency 8.1
drawing 7.8
bank 7.8
construction 7.7
bill 7.6
dark 7.5
snow 7.5
evening 7.5
tourism 7.4
office 7.3
cash 7.3
hall 7.3
graphic 7.3
trees 7.1
modern 7

Google
created on 2022-01-23

Plant 83.9
Font 83.5
Art 80.3
Painting 75.4
Monochrome 74
Event 71.6
Room 71.3
Decoration 70.2
Rectangle 69.1
Monochrome photography 68
Visual arts 67.1
Symmetry 65.9
Illustration 65
Drawing 65
Curtain 61.6
Picture frame 60.5
History 59.9
Arch 58.7
Pattern 57.6
Holy places 57.4

Microsoft
created on 2022-01-23

text 98
old 68.1
drawing 62.6

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Male, 97.4%
Calm 54.6%
Confused 11.4%
Happy 10.6%
Sad 10.3%
Disgusted 6.7%
Fear 2.6%
Angry 2%
Surprised 1.8%

AWS Rekognition

Age 41-49
Gender Male, 99.3%
Confused 60%
Calm 15.9%
Sad 8.9%
Happy 5%
Surprised 3.5%
Disgusted 3.2%
Fear 2.1%
Angry 1.3%

AWS Rekognition

Age 24-34
Gender Male, 87.8%
Happy 76.9%
Sad 17.4%
Calm 3.6%
Fear 0.6%
Surprised 0.5%
Disgusted 0.4%
Confused 0.3%
Angry 0.3%

AWS Rekognition

Age 34-42
Gender Male, 53.3%
Confused 38.8%
Calm 26.8%
Sad 14.1%
Disgusted 5.8%
Happy 5.7%
Surprised 4.1%
Fear 2.6%
Angry 2.1%

AWS Rekognition

Age 39-47
Gender Male, 66.4%
Happy 57%
Sad 12.9%
Angry 7.6%
Disgusted 5.8%
Confused 5.1%
Surprised 4.4%
Fear 3.7%
Calm 3.5%

AWS Rekognition

Age 30-40
Gender Male, 99.7%
Confused 40.1%
Sad 21.1%
Happy 12.9%
Disgusted 12.1%
Calm 8.7%
Surprised 2.2%
Angry 1.7%
Fear 1.2%

AWS Rekognition

Age 21-29
Gender Male, 96%
Surprised 46.7%
Confused 32.6%
Happy 15.8%
Sad 2.8%
Calm 1.3%
Disgusted 0.3%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 29-39
Gender Male, 87.5%
Sad 63.8%
Confused 23.8%
Calm 6.7%
Happy 3%
Fear 0.9%
Disgusted 0.7%
Angry 0.6%
Surprised 0.6%
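Each AWS Rekognition face result above ranks eight emotions by confidence. A minimal sketch of how the dominant emotion per face could be extracted, using values copied from the first two faces in this record:

```python
# Emotion confidences (percent) for two of the faces analyzed above.
faces = [
    {"Calm": 54.6, "Confused": 11.4, "Happy": 10.6, "Sad": 10.3,
     "Disgusted": 6.7, "Fear": 2.6, "Angry": 2.0, "Surprised": 1.8},
    {"Confused": 60.0, "Calm": 15.9, "Sad": 8.9, "Happy": 5.0,
     "Surprised": 3.5, "Disgusted": 3.2, "Fear": 2.1, "Angry": 1.3},
]

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

for face in faces:
    print(dominant_emotion(face))
```

For the first face this picks ("Calm", 54.6) and for the second ("Confused", 60.0), matching the top-ranked rows above.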

Feature analysis

Amazon

Person 98.5%
Painting 65.6%

Captions

Microsoft

an old photo of a person 65.9%
a group of people posing for a photo 47%
an old photo of a group of people posing for the camera 46.9%

Text analysis

Amazon

B
11 B
11
EA
sild
weese
DAK-

Google

YT37A8 XA
YT37A8
XA