Human Generated Data

Title

Untitled (bride and groom in rain)

Date

1956

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18973

Machine Generated Data

Tags (each label is followed by the service's confidence score, 0-100)

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.2
Pedestrian 97.8
Person 97.7
Road 96.3
Tarmac 96.2
Asphalt 96.2
Clothing 94.7
Apparel 94.7
Bag 89.5
Street 89.3
City 89.3
Building 89.3
Urban 89.3
Town 89.3
Path 86.9
Wheel 86.3
Machine 86.3
Sedan 85.5
Vehicle 85.5
Transportation 85.5
Automobile 85.5
Female 84.5
Walking 82.9
Shorts 81.9
Shoe 79.7
Footwear 79.7
Car 79.6
Woman 67.8
Airfield 65
Airport 65
Hand 64.2
Dress 62.6
Car 62.4
Portrait 62.1
Face 62.1
Photography 62.1
Photo 62.1
Backpack 61.2
People 60.4
Intersection 56.8
Kid 56.7
Child 56.7
Coat 56.6
Floor 55.3
Suit 55.1
Overcoat 55.1
Person 46.8
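
The Amazon tags above are the sort of output AWS Rekognition's label-detection endpoint returns, each label paired with the service's confidence score. A minimal sketch with boto3, assuming configured AWS credentials and a local copy of the photograph named photo.jpg (a hypothetical filename, not taken from this record):

```python
import boto3

# Assumes AWS credentials and a default region are already configured.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# MinConfidence=45 roughly matches the lowest score listed above (Person 46.8).
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```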

Clarifai
created on 2023-10-22

people 99.9
transportation system 98.5
vehicle 98
two 97.6
woman 96.8
group together 96.7
street 96.7
adult 96.4
group 96.1
airport 95.9
man 94
three 93.3
wait 91
wear 89.3
monochrome 88.7
child 88
many 84.5
recreation 84
several 83.9
road 83.3
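
The Clarifai concepts could be reproduced with Clarifai's prediction API. The sketch below calls its v2 REST endpoint directly; the model identifier, API key placeholder, and image filename are all assumptions rather than details taken from this page:

```python
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-concept model ID

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concept values are 0-1 probabilities; scale to match the 0-100 scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```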

Imagga
created on 2022-03-05

intersection 34.4
aircraft carrier 27.6
street 25.8
warship 23
vehicle 21.8
sky 21.7
travel 21.1
military vehicle 21.1
ship 20.4
road 19.9
car 18.5
transportation 17.9
landscape 17.1
city 16.6
airport 15.8
water 14.7
clouds 14.4
aircraft 14.2
people 13.9
transport 13.7
airplane 13.4
outdoor 13
summer 12.9
plane 12.6
ocean 12.4
vessel 12.1
speed 11.9
sea 11.7
sidewalk 11.7
walking 11.4
urban 11.4
sand 11.3
beach 11
journey 10.3
tourism 9.9
coast 9.9
vacation 9.8
tourist 9.8
aviation 9.8
jet 9.5
traffic 9.5
empty 9.4
man 9.4
outdoors 9.2
craft 9.2
sport 9.1
building 8.9
landing 8.8
pavement 8.8
highway 8.7
flight 8.6
day 8.6
cloud 8.6
person 8.2
activity 8.1
motor vehicle 8
line 8
scenic 7.9
business 7.9
track 7.8
high 7.8
scene 7.8
adult 7.8
men 7.7
concrete 7.7
shore 7.4
town 7.4
park 7.4
air 7.4
black 7.3
mountain 7.3
sunset 7.2
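
The Imagga tags correspond to the output of its tagging endpoint. A rough sketch against the v2 REST API with HTTP basic auth; the key and secret placeholders and the hosted image URL are assumptions:

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"              # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"        # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical hosted copy of the image

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```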

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

street 97.4
vehicle 94.1
black and white 94
outdoor 93.8
car 92.8
text 88.7
land vehicle 88
clothing 80.2
person 71.9
monochrome 64.1
people 62.5
footwear 62.5
way 40.1
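
The Microsoft tags match the output of Azure's Computer Vision tagging operation. A short sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders, not values from this record:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    result = client.tag_image_in_stream(f)

# Tag confidences are 0-1; scale to match the 0-100 scores above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```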

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Male, 91.9%
Calm 82.1%
Happy 9%
Sad 6.7%
Confused 1%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 16-24
Gender Male, 99.7%
Calm 99.1%
Happy 0.6%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%
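
The two AWS Rekognition face blocks (age range, gender, and emotion estimates) are the kind of result the detect_faces operation returns when all attributes are requested. A minimal boto3 sketch, again assuming a hypothetical local photo.jpg:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```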

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
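
The Google Vision blocks report per-face likelihoods (surprise, anger, sorrow, joy, headwear, blur) on the very-unlikely-to-very-likely scale. A sketch with the google-cloud-vision client, assuming application-default credentials and the same hypothetical local photo.jpg:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```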

Feature analysis

Amazon

Person 99.7%
Person 99.2%
Person 97.7%
Person 46.8%
Wheel 86.3%
Shoe 79.7%
Car 79.6%
Car 62.4%
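
The per-instance percentages above (four Person detections, one Wheel, one Shoe, two Cars) look like the Instances arrays Rekognition attaches to certain labels, each with its own confidence and bounding box. A short sketch under the same assumptions as the label sketch above:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=45)

# Only some labels (e.g. Person, Car, Wheel, Shoe) carry per-instance detections.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}%")
```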

Categories

Text analysis

Amazon

39
YY33A
YY33A ADO
of
ADO
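
The detected text fragments ("39", "YY33A ADO", and so on) are what Rekognition's detect_text operation returns for signage and lettering visible in the scene. A minimal sketch under the same assumptions:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries are whole lines of text; WORD entries are their components.
for detection in response["TextDetections"]:
    print(f"{detection['Type']}: {detection['DetectedText']} "
          f"({detection['Confidence']:.1f}%)")
```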