Human Generated Data

Title

Untitled (bride walking up steps to church with helpers carrying dress train)

Date

1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9550

Machine Generated Data

Tags (label, confidence score 0-100)

Amazon
created on 2022-01-28

Person 99.5
Human 99.5
Person 99.1
Person 99.1
Person 98.1
Car 96
Transportation 96
Vehicle 96
Automobile 96
Person 95.8
Person 95.3
Person 94.9
Clothing 94.4
Apparel 94.4
Person 88.5
Person 88.2
Art 86.7
Person 82.1
Sculpture 66.2
People 64.7
City 61.6
Building 61.6
Town 61.6
Urban 61.6
Drawing 61
Statue 59.8
Female 59.6
Overcoat 59.5
Coat 59.5
Flooring 59.1
Tarmac 58.6
Asphalt 58.6
Pedestrian 57.3
Road 57
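
The Amazon tags above are whole-image labels with confidence scores, of the kind returned by Amazon Rekognition's label-detection API. Below is a minimal sketch of how such a list could be produced with boto3; the file name photo.jpg, the MaxLabels/MinConfidence values, and the credential setup are assumptions, not part of the original record.

```python
import boto3

# Assumes AWS credentials and a default region are configured in the environment.
rekognition = boto3.client("rekognition")

# photo.jpg is a placeholder for a local copy of the image.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,      # assumed cap on returned labels
    MinConfidence=55,  # assumed threshold, roughly matching the lowest tags above
)

# Print "Label confidence" pairs, mirroring the list format above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```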

Clarifai
created on 2023-10-27

people 99.8
group 97.9
many 96.1
adult 96.1
monochrome 95.2
street 93.4
administration 90.6
one 90.1
wear 90
group together 88.4
woman 87.6
man 86.6
two 81.7
furniture 81.3
mammal 80.7
vehicle 79.8
several 78.8
art 75.6
transportation system 73.3
child 70
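
The Clarifai tags above are concept predictions with confidence scores. A hedged sketch of calling Clarifai's v2 REST API with its general image-recognition model follows; the API key, the model identifier general-image-recognition, and the file name are placeholders or assumptions, and the exact endpoint layout can differ across Clarifai API versions.

```python
import base64
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed public general model id

with open("photo.jpg", "rb") as f:      # placeholder image path
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 value; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```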

Imagga
created on 2022-01-28

man 26.9
people 25.6
person 24.6
adult 21.4
male 21.3
business 20
office 16.6
working 15.9
work 15.7
sitting 15.5
cemetery 14.9
businessman 14.1
computer 13.8
newspaper 13.2
laptop 12.8
city 12.5
clothing 12.1
building 11.4
men 11.2
suit 11.1
worker 10.7
old 10.4
corporate 10.3
architecture 10.2
scholar 10.1
life 10.1
lifestyle 10.1
occupation 10.1
religion 9.9
outdoors 9.7
color 9.5
outside 9.4
robe 9.4
hand 9.1
intellectual 9.1
technology 8.9
group 8.9
job 8.8
indoors 8.8
world 8.8
urban 8.7
education 8.7
product 8.6
face 8.5
groom 8.4
portrait 8.4
outdoor 8.4
looking 8
couple 7.8
travel 7.7
professional 7.7
casual 7.6
executive 7.5
one 7.5
covering 7.4
20s 7.3
alone 7.3
shop 7.3
women 7.1
garment 7
room 7
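
The Imagga tags above are keyword predictions with confidence scores. A sketch of querying Imagga's /v2/tags endpoint follows; the API key and secret, the direct file upload, and the file name are assumptions (Imagga also accepts an image_url parameter instead of an upload).

```python
import requests

IMAGGA_API_KEY = "YOUR_KEY"        # placeholder credentials
IMAGGA_API_SECRET = "YOUR_SECRET"

with open("photo.jpg", "rb") as f:  # placeholder image path
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
        files={"image": f},
    )
resp.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```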

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 99.8
black and white 93
street 74.6
statue 68.9
monochrome 65.3
shop 16.1
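
The Microsoft tags above are image tags with confidence scores of the kind returned by the Azure Computer Vision Analyze Image API. A sketch against the v3.2 REST endpoint follows; the resource endpoint, key, API version, and file name are placeholders or assumptions.

```python
import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                  # placeholder

with open("photo.jpg", "rb") as f:  # placeholder image path
    image_bytes = f.read()

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},  # assumed feature selection
    headers={
        "Ocp-Apim-Subscription-Key": AZURE_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Tag confidence is 0-1; scale to percent to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```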

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 36-44
Gender Female, 95.5%
Calm 60.4%
Happy 19.5%
Sad 9.9%
Angry 2.6%
Disgusted 2.6%
Confused 2.3%
Fear 1.7%
Surprised 1%

AWS Rekognition

Age 38-46
Gender Male, 88.4%
Happy 56.7%
Calm 17.6%
Fear 16.4%
Surprised 3.1%
Disgusted 2.1%
Sad 1.7%
Angry 1.3%
Confused 1.1%

AWS Rekognition

Age 29-39
Gender Male, 79.1%
Sad 59.4%
Calm 22.5%
Confused 7.3%
Surprised 4%
Happy 2.3%
Disgusted 1.7%
Angry 1.5%
Fear 1.4%

AWS Rekognition

Age 31-41
Gender Male, 98%
Disgusted 64.5%
Happy 15.5%
Calm 10.6%
Sad 3.9%
Confused 2.1%
Surprised 1.7%
Angry 1.3%
Fear 0.5%
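
The four AWS Rekognition face entries above (age range, gender, and a per-face emotion distribution) correspond to the output of Rekognition's face-detection API with full attributes requested. A minimal sketch is below; the file name and credential setup are assumptions.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # placeholder image path
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions sorted by confidence, mirroring the per-face lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```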

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
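
The two Google Vision face entries above report likelihood buckets (Very unlikely through Very likely) for expressions, headwear, and blur. A sketch using the google-cloud-vision client library (2.x) follows; the file name and credential setup are assumptions.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("photo.jpg", "rb") as f:      # placeholder image path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood is an enum such as VERY_UNLIKELY or VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```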

Feature analysis

Amazon

Person
Car
Person 99.5%
Person 99.1%
Person 99.1%
Person 98.1%
Person 95.8%
Person 95.3%
Person 94.9%
Person 88.5%
Person 88.2%
Person 82.1%
Car 96%
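
The feature analysis above lists individual Person and Car detections, i.e. labels that come back with per-instance bounding boxes rather than only as whole-image tags. A sketch of reading those instances from the same Rekognition label-detection response is below; the file name and threshold are assumptions.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # placeholder image path
    image_bytes = f.read()

response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=80)

for label in response["Labels"]:
    # Only some labels (e.g. Person, Car) carry localized instances.
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # coordinates as ratios of image size
        print(
            f"{label['Name']} {instance['Confidence']:.1f}% "
            f"left={box['Left']:.2f} top={box['Top']:.2f} "
            f"width={box['Width']:.2f} height={box['Height']:.2f}"
        )
```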

Categories

Text analysis

Amazon

20012
KODAK-A-EITW
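
The two Amazon text detections above are strings of the kind returned by Rekognition's text-detection API. A minimal sketch follows; the file name is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # placeholder image path
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE entries group words into lines; WORD entries are individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```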

Google

MJ7--YT37A°2 - - XAGO
MJ7--YT37A°2
-
XAGO
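
The Google text results above are OCR strings of the kind returned by Google Vision's text detection. A minimal sketch with the google-cloud-vision client follows; the file name and credential setup are assumptions.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("photo.jpg", "rb") as f:      # placeholder image path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; later entries are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```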